[Binary artifact: POSIX ustar tar archive (owner core:core) containing var/home/core/zuul-output/logs/kubelet.log.gz, a gzip-compressed kubelet log. The compressed payload is binary data and is not recoverable as text.]
4j)ѼWO5/.VۏjŤd(̟>{K()Zl}SU7Qid՝Et|?6,C;*``x\"?ǟJ{zװ~e\boW]c: }D47oZ\MC{]ӰoZRshWC_|-uWQ!O1َVLM|͹``_ &>zL>H|\Nƍ$&GغZq:A"LT- PZbSFb1i/q4‰R`oh0@p8r˛[lm`(/@^b9r3 ɩD϶&5GԲnzslVA'\URE88ݻۑed?=o`@~^=pRH#g&Zd2ƈ9t-1u\+mv" h0zF}3F[!(t\TpJ(QCso|y3-3w2Kˢs|}͞ԙŵ잪]F:k =UOLػN^ṫ큔t a:G=. l*52v>yUx-QZ!^_tcx5>kWF;V z|4wE7W ee4reYr4[q_qV1l9"\^dSW0%H+&胋̔0hE5?{O6aw\7ݵ寉!^^zsyV^t%iI;"ڙck߃afց]=mWx[ Yc/d>`&` Dtc%>;4E\1NϿ,y|/w"}ȴvI,.q73zoؘ&4l/C, Q~$Sas=B6ؘtbi ԴS1ocb(5Dvk`[mû:}ψ/ wF[seU6ֈq{{ҽض| lw Y =˺H:f*BB3A7,Z1mY}wNJu꫶ܰy`w ӿv:.t[aU)i{.uL-]1|} 3:;rwL V#w>֏ݺ[:s#[{f[]ju۶LPhy=/.+2콖2[,6`gD^;yf:u(Z-f7%8P^  o&I7 'I!c XFrgr ,^8˞^Pfqu9/iZN112F0T9d _MW,X L31X +7‭i4%{gqoR0#4 Ala#5zdjҁbxtbȒ˒["hA](b ^}&*4;^I}U A墦\9Eje5; OA)_|vF3.j4#-=5IOp*0GXLIOHa˛*^Z鄒1&Ez@i6$54!13 NRɔ*>Ww#ʤCON.A5/=B~H~j^*LMrK#4#[z/7W6ŵT1FZ`R0 X)(! lEt &;V>,\\FGF'#sk ƈԔFmo 8XV9AdXGiBV93(* %B# 9`RZ`H' p.`HETB[,[wzY#W_^u 5q@Bb|iQ,8= @ F!)S!5H%@Ny)U=3ǖ cX + (eDHD 11gP @lP͇N*vRxOt(S&l〝I$X[ (d, UQYIͲfXNٲGA6!!M 6eNJqGG\:9:w^PNi]OCg"|ٖȡ3 _w'NQT(hBH唏)ey b8Z3its 0h|cOnOE:CCd-:Agicufxޙ|X"kO{<.=iDbtB6I6=&0Xe'lNR6I}}[ Z EjXD)r+o08=9b ;!b ; ^4-U$B" %kY3&e޲ }GYq.‘` Y-"L@iG EέRȒ>1rREb?D1lmACw4 Z;x{SEUHOp|| b#BL4apq.H) 0͋|6zUuSBq|>A+ؖΣI6)|:($ 뺰 뺰 뺰 uº.º.º.º.b'ua]ua]uº.º.º.º.de 뺰 뺰 뺰 뺰 1$\fWCx|&Zdx akɌYtVZcfOfa^xYK:A a@H()CsoIVf2(\LwZ) 00@`e4zl5ͭHĔ2[wekUKv2l{mi3]wg/_$͛uomK ;{T'"9wYzc3vSu z<7unu{ýɻurgf)L㑭'&9`U/q1aݹػߩt(1{[gAjs^}PʓyJ^xG} uc͏GRTMw?x;˥ +d{о%tyGϟ>oC2-b?m.bhiuz@\JYFF/7kmjɐݍ NaTWoo~ .:SJ09ǫo>n}O?lEрmP5 ֖S KzMC ۃ\On?zdrrM 9 VEDEIEH! -w,ϖ>Y;rЉ 9l:q 'R/=},P\~j6"mE:&aS=j0gLKnsO'޸k}HnBQa?5B3$#Pjv9Lxl{-fWhfu ]|ԀW 5Zp/'38ϛ"4դy ߙ_OG_@՛ju$wmI_!eo]m8l'upgQBRW=|8E6EQp1Uտzwh;I#-|";ӳ? 
?~yӒz"'2Z0+a݈רg"^Ѡg;42TM^U҉ Mu.NKԵԻ'A$˝MO✻JX*cC2)xTNoVP־ov eiiHTN:rJ2Em>A0qB`E@șƝ_%rNQH+ SQQs8.5yAZ☋kN՝esݯ%^w'Si8P?UY5o o5=4z2p8> 3?f z7漾dgi"`UCM~_:XA)O(-d1q .DxІ e`QFScϖ ]*.y7sl>Qͥ'fVvƳGd\]tOǽhFO+oO@b6'as=>Yf,tږq7E:Ik1I%x+K\c0P]㎎<53B*<Ǣ4e(f\*r28dv,φ7宮\N?G\)oOoj"4lPB,V<8BH0`yC t !9GrmnyXPDtH1mdNR)lB!UZ2q=J9,,FBYXV<{| ,7l<zOnl7u?%6*WC Q"D5WTS u2IifyE%6nCl`;8b+pJL =dbDaT1.6%Âry,RuԖnl QB΁WmTy(B$^)t(>u!!{xwQ#%lT/$XcK?J(&)MyЂH.$xޣĄVSsߥ8GD֒'dw`N*H`49"Zdz܄ΞL\' Vdaωq1r&OsoEӻ;۳Mu||z"0{8|2[zݳrs FNz\'Com/BFҺq$!Wt4 k08"|B ?`|F^vkgi4jZ GeoUqHX"4Q R' SzB;;kCCopYl[[.;I2T'̾= /k!SfpVYBJcnqwΑI"ˇw?~??_~O_(/ݗO97|/sU)H`;O៻L@  6ڰRCx$|hUUO7|q)qZW /C-B}Ύ\4ڙM6٫y$**ՏvjB[JCH|]+c(c9h45!+`"6@0B[%鹭 3R1Yf;.  5S|71X+%H[:Ꜧރ 5 %N"NT&A)<VeuyG1(ni98njc @3;;K5}iBMU@U k\e]T$ A"ԑG43"y%),I43md! /x9j:5QRyLO4)olPV#JhaWl:vWW^iAK#p!.kgV6gsK  4%*jrQo!H)G) ħ 1i( z').1J "Srl>[PMUCH14Ǵ҄'VPl:ée)tn\څW#̫%Gִ~;CWrjlρ[YW߂a%)\@"e'PmO#<Bhz!˯m6|QzkF.a'g7|a}#iiR i!Î8CYҴY }g?ֳX&?Orgiϫ{ n"oab{'^ZZShέt=x9WwqSfe *ԻU% W$pM&bA$P#J(vVge!CC'E 4ҹϢ0!DiC(vep@P DL ѦS2,ZB 6SZl:F p|H5Tz|>gz=ԫ%R6Q 4d[dbjwem-uHX0q&[yL*/Ѐ2EjDͯOp%$PlP$_7%9ji3JgҖWL flN'ǹvs$xzC!(lN*jIrd}Z-o@&YTww6Ŝe],#YԽ:|7:eCN\#dxr缵S]]#d$Ͱ|T*t*X* R#BJ o81%< .% mi h*:Xđ{}JmOPiK)opz X] tuwVrssl80m1YҠ B7I"-,@2QE2 >9y߿ [Wqt]DS<+w:eBU7l R RsJ a}~l C]8g8^HBIR~Rfp нeH tظgSnWHx٤S* lܥ }$.2Ct&sL[zEWGM[gq`V|:PR_ωsR#[d)L' \fF[)seEUF2qqMB]L*[I/0yyLHnQ:( ưyǿVwvhp;zi MWe_}FNUK_m<)]h48Ϯnz |C9,Z|mКӷڽU.>QS9cLK񨪕wǡE%ɶdrƼ(>h[ٍ^Gυi OFQ[s@Զ4m|AEKMB͵+TmX;{ފR BUYBӅՅͮ.?vgL4ɟh42Ng߸fcILlb1H W!cWȼd|0˔\2MC< &C(ImJmph s6u2 p 1erVwvk0qy*Z8nvM ȑi.s<Tg)Yv4 t2[dv^rսI&1XŁ,dBȊZRpL{d0QMfgպ[F}}5ThjQWֈiĦk(+gɤlM $wY.4ld 0p]MҚj ;PhuMotѫWqTwWcD-HdiFocql4Ʊ86 cf5Fh7Fh iii4Fhcql4Ʊ B=ǬҼql cql4Ʊ86Fhcql4fY>ٲ~åY%uFHUwf $ApVzꔭ'VVK&d>c3iylVdBH98s1Yi,'Eeӎݗ0wUL&x e ą`(ā YmR%I(ghdL;Be\;{u~2y_Ir^z2$u4Tћ@767Qʦ\2<}.w&7<;5k|WUcZc5(/H:"Rg(PL&QjdD2X@CPkkzZ)%]a$!(N%3'I%gRrt.l:#xqF[|jVn昣 GR Mw }]y 2e tsn5xL84C_z?t3K/߼9]\ H ,^} Q ,3a3J]>dz w;#Z̋#ڻ5go]f_wY+aՙ pG3+̮9YjjGi`lSKll -]ljFll ,O4e.>n=msx1mncmou9Mn+aUrȑRV|>iL(s~)CJ{?+5W+׿0ƿ ?Yf 
INm,1pꢷ|IJ?_ ؖjZq8<ˏ$$"Ƿ/?~=ۿ}O?>'z.K.(+݅=3:UxM;Qijo4仛6Rbh׸C V UAtG6*ǾM>I9Ⱦs~}>;~soIL"xwte$Mp NbFb3^۰.]/>aN ǒ@iWtl%W*H[B!ccdX$/T9ԙdȭ&!d*ͪ'yΉG4N7F4t dzW/9n:Lҭ[TxJM3*9gBYtJ ̊#p&}MwoDн]7ɣ9C <ɢr6A \6AY*8 "=pt7N"d-SYN5{1mQ+]̊Vˑ^c"W[bH>qDU#w|qOhk|܁5݁vOw=c>=Z9_0禦s3+y:yPY&=D8-NKsp0"scHr,XbLVRHa%MQgDLTB휴2,w)a.FH\dL2#9x'SA teH*X@g6|YVΒcz _u=:Nnt?bwUZ?u\0f*Q~t7/ztG2;x+\2_»+/Zbf{ԼVrz(l6׷tv;}d3 W2|GŋT,ɯ?um[ \Yt_jLw̋R%Mk•[n6f-C9:YLd[m.H;͙0*9rM=&4=&=& l*$. 3h-u!Deb8hg Ơ\jrZE,0CH pܡ1R(d/k,pC=QXZwv&Oɶ j+c"U'|hϻW/^Mŧ?˥h@ۊ!-ɣAsRhdvZ㹦%o4" $W^7^EvZM{|7\)=j_V$FGǬ=crR';@o:8$ 3I$Isq_}uɇE!ޣ 7˗HhGF^a6ɧ°w0~3qX=~9R}+Tv v9%sr,Ӱg/Y,n3}]Y`Ü Cߪ&7MnNRnyܔ }o>.OpX_~nn]i-*;dsi5| w$ f =˺I:?`5xaz;~-ݘ6,}wNJwuQ[jm< ; ޿v:UNIs7M%>ݜftz>o)bX?qnw_<p^:(x.-n,lq'lUo-ReTǑвGPim.Nz*Z;GoxnCҚAQvp88ij[UΉ`gX&F1F+&,DPe4Yp*eLIeٯ6en?/B"l:y]uҗY %jP{:TtW7>^#yu(B/hvG٫F`ϻP4Pآsp+Ʌ79s ,xf-$Ӛ[o b]!6Zm*-5yRwv0j+j'˧t'S? HcZ=>[X u`_:^2j;h `EALëG«Z+u*~7+nnA)fvmXWrtI ?&yQ$niP/?yآ#th4B; ZYrGR'NeNnQl[tD^$$nH=Em,2K˧ZzuL0Q[cxҠsH\HE 0' &(\@M%T;tѡij`,2 1k ϠW8!g~pEeS|튖3&HА),V6%3R FW2EIn5rhla3$rv3wmHwC.`8lf_.HV"K^K-K~e+6m+I#K&*>U,֣|кȿ!dE$)Avʇ#5ճ(J^i傐,$^zX=Ceb[` \"^ngͺz7fd[z×ZKWC)1WnPz0lm{NjL-I-UoE vr2)!$$] DDx6Q Kf8(&Ed ةf2q5_m^w#mrYS^(,dTJ+e8*P‡: _ßOmkᾝPWy绢|Dg+jz:` jQ@P91@J&IKֲ'1[iV4y6 ~:*OFDYy&gGCG8Sb5@u;7l4d-N'-@R@oي$c~- ޚ.(OAh9| 96KY)Ƽz܇ޱzfx* mER dHoX:\RԆ =Ѩ\sCO4-<~bԧFW̓9㦩_?O 9^/+j5]?`q߲Aَ&Un-PˉK]ZF[R=x!@LEtPC]8gK(4xrS侵͎j.NzO{C3N26w0wWϟf#eǬ$f7A U5ː>{ X4 Cp7|A,) rR\ z,!)-+g$ Fel֝-(y.B>:(oqZC ,?)>z5?]yA'٧ @?f*e(0h$%m! AX^;Y+ 1{EmL^EMmkgi0T/:%U5Lr {ɢL \TD:/Uwi6KXfұ+VV{@4% :JrVȢP}O]KS `j #22謹ɢDv.#߈ `fGun{;+jw"6ZD""V sm|F d8/uURLU:!bh rfOZ_ǘR6|Zunxm*.:I}"4~]iuUo]>R;mt[ݭcN^n ~[dy }ϔާզ}z_̇0`X1lڅ uT :p:ְ9 .!Đ7TB)1@Rt8Q**Z@ )QJL@$-2*itdPR, 6j9z(& (\cڬ;[&͝gQ[^Oƿwc7ӱx8\^s=5 Xd-%XmY592*Ni( > oȊ(*C۸{Җ)GAJWG&#PT +(Jtd*Y7|#d6[M%aן}5͡٧Xz|& )=˕U]eꟍȑC6Ʋ $]PY0Z&[r&{D/*\<T_x|$P[KHḬ9/P֣|LJc@2 y uB 16Un6g UckTiL}9][Gzک%|<*"+^Xk3/GntsFz"d_Z*rN\Ɓ0G\>2{)ZcwR8QlhV U!ɘ~Up. 
Ѵ/].{_f GGSHg؛~^\b񕪫+W{yݯ=X=;~)5ZeGWb?P?tqxƛeIߙxZ&p1Dl/o=2Blj-{sKGQ nl沨cEz#XGl56G-tتzMnxV-ƂVuVR,>z9feTƾܓr䦛뛞MP[ߏ~=ߎ繞.9[$'2Tx#\}ec4;9_v{{e-XR|t(YI&oCO^T׼eAVU ]C:~@>k7ntPVMJ~w^O}>[CG߯k@]| nvd3:7 p)&C݉?CV߹%B\c.8.Cg%X @ˀ!88IOmXYW?w)ⰲ;^g pAf5% UңˠYm)y3St/ei3񮜳k&%7 YG![;_=L7!yӠ'"ҝp2( G6u&EjJ )xFq;tn5:7e8)P5Q#xt,ƨP"FWT`\#mA_%`4RdWE[D"aeL~ lߪ;[P<X3p{ψ8*Zt䒉^"[)$/R$vYoCASvHS4E.UcB 6(dc@)sRTN# u݀P$sI%@ @DbqK%(X,Jiў֫]u,tֹ}Jטu㖜ȳ\0G%tlt섖C3w|q hk94))h 9E=<"O|5f2 BLK1N.`{("K{E+<'k_FUQ1!Ś V)4.X%i~MJ dցm[+{֑_dRM=#ٹ*$RB8Ь;wéVhJGܳG9K7߮~y2'ϒJ9<qBH/ \޿JTMP?@:MPױ^:%B2!h-/Tp uJC*0IdQHaD !,Q* @M# ?{ב\@=VWWKHX0`M"953Hy.I-;3uqN?>ҖþVWKWW_o&ƗҳZկ4+2B뛭X߮V7 %o o?O^zKe߷v<ճ.Ng}]ڎZݠd{@Z iy⏆eKbȨ`K崗˴$c-դuq%uZߖu}U돆9%`\-}4JsٴOAN(ɐwR <kZfisҍ|3]ԳNM]oϼ|365_)`98_Zfk>E$xK3]b@}8ϔתmMҭXl4Z{(cmvUMrN\=@iP^xҁ'}Cꏩ+l\m\5]tN5"Hvf{ILHّxܔ!pL Ov4Jj.C>i:l}5rQ腤l!$Gf\;0>ty3"Mk Sqi&Psr=Q;6#\46H qt7w| 7@Bw\nH8l6LM9]m6ѷuv[w٦y^\<¶Otj붫wV"_#&ػ V zzTVu;C}óm-=%;K_8mXx˟=J\g'O?|(iW[C^ofz١r>X`#M߬Nη' 4Kkv,?_LضE%"_'ZH6r.l >¹y ^yBI+rlMKlS >7Xl+mHR ԇms!BoDt€Υt|8z |CZhp(`ơ;I@>E %R8{sY{t|- mQ 7ȶŘ=Yh ,*_ -TiG&F'5Sl$/]| A8qC+n[. 'xA_:Bypm_z /߻n|o.ڻ@KKw_m?Oj _p9PWWb xśm^z/XTS7gy*E.+n=9BM߶߮`Oߜ]-Rj?DN@]Hڽ-s#G<1"}&&mܻ (KzY?|-3%-$7Ec^[x ,/NϮ9y/)G{WԊ_͂:W@o61(cüK7\7ՍuvVLW{}U~ S}C'>vӢY>!!5"+)oW,4YOƶ7|f||DͶ3fzل?|6kf[vl!_x6SaOջc&_)mp#GHׄang}I{y$[p_ۿv*NZR\ގL՘D÷$= SR!⋣l dLu^}l%$[rMK%TH2 rum$akc;76L>_`(.)e<#_3 2$a h-(`,Ao VY(HF3 a6?7V1D:LrKaFy8'WXڡM>ԐWoNli j+} !29.@RInH > bK=l.%]% stå1Z*F-9\kN"ac$õmᑩ4bъ<6a kMOlB1<"] r1$xڹ#f ik9Mme_ 3WrS]Ϗf_\$; @x,gOf מ5`!mӿ( >;Z,FO:ԚյQ3y&02 8hMSaƌ ‚Dr5wwLԃ` `TEF5!Mu֌taYZ #!VfBf~Z) "-r|dNBk@=9"FX6:T0bn(ax@\0 .FlnT_C6ӥ1p0r";fhR{VFRC8/.P,V3"!K#քM- + }cXz~?CM$:kQBRz8|KH"24kzz꜊42AU&ի$Bσy. 
ʫ7*+Zr+cg+p)g^}}R޶ìWUmeU.i0Tunvh>Ţ|I1Ro$K1!@`% ju4H ֲGi8"^#ȔD!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$"^/lDŽ* ,1S*H#@JHWW7=|:b֎tŃ)eJ `G 2R4;Ńx0>1i$(сfJ:Ç(# 1yj6[,ܚ8ۋׯ&[(ou|a/|U\n.\*֛`4}j{{}g[&`U}6f޽fm6vrhAL"]uJܶ%&][2￴&7wmtfs-^oͺChYA˪YwizzԬӕV[(2 8!T"9^wde+SEQu.tw~ HaC+oSkpW`tYT'7J6ɱEh!j 61[ٚ8k@M5 -=Xl_8όR5 5zh#vl8ۭl+46ZE e,*:e=(408z1ۊ6۷pM]t_l!*%-ҋno 7/Dx8 H*H2Fv5O1K/ u[S5y_RM뛄CJq4ɻ=@ BHn r<%ײ. (9xHɰD.-?zoGf_-d\o׸ $&W\JD+(1I1+ F> ^0J7R~\mIw:_MD6u KB ~QЃg mf>,_vC;cA>x[+0'Ǒ AqF0d# zߕzdx{p,=O0}3Ҵ{0t~3( i #s\_vO!`\Y4W}Tr*+1~4檈{'KsmԒ&V NbeU09l)(o}@,:H+DrEZ#=)%ɽH|UCe*uHi5+!%S彣2LȲC`p=~#;?U8Q ׫'kcTK߀9m߾c_,޾|4>XIz4[_Wc@ZMo}EJ[߫ Z纽[ntduk]+*Q7,)%8g`噐H $ڇtyuvxtG<陟`f[kؠɒ2LK$T6$M̒̕C QfʲԴ!. +K2F"03/Z+'3L>G ⶌiM9g|O̬V̾> 7If~b }zy \=DIX})ocӣwQ紁%DkVpT,cdi c:D %h˒HKOި!C^PpӔpzucDH2K]i&X6v{)m(͔Fk _d7ڟkkE8# DT%#j*hsGXP" +3ԠƩ2*RqfDYL2!Q/yG39(ڰg̤igwᲜk]H?ƝA>{zN?ٷFkU@3F+NV2Y_w.\?³LRZZALKq&했POK8B 儝\RpFt{'1ԳUsmQ$9* $qU^@IŵLō0oTPo1(UgsdbAN/mfnQ?樂5vt|u}Q aւK#pҍuŒ֊nl}44w&gΝ;jZw& ~9s}EE0.f\~u/&3|f:Ab`ͦ B֍4I#]2vťT6O0e.>n=ٽ7ܜvluzF]s[luTi!*; Q rSzB; Ub ya:3Cuv~; vJP ~nI&SAW:0G,?4]RTlU+N^_ L??~?Qf>w=2N5"HC{C݇aаoZs]7øڶykٸ*&İEhfG{:ݙp7;e j1%E0Hyw7HbIB}(gZs\p`c5.qnc3>As6LKU_t0Nlpp":gJsR Ny-=^L "U)R/uZysXfmG6 f=xcNC.[yibK7]VUH @>@l%I O4?gO|RFSrD%iC*0W_9x,fI'2ԁg4m{ЏUd(J"R8Z5卷D%>fRN_j9cW4 p`Ԏ PHdži3D_@מXVx[SՇAxBǁb qLMiz\`*[ˋ +XU1cx%wU$O Dc? c<|!fv ~ ۽6DsJYbf`୍c!TĜ7eץtɶ\ʷ^~\7؊{>u V |Hڕm 1թX'o$V|O͇e&Ⱦ*z{X5o:c#;?{SUœsb-l~EO<-m>"?Ry4E\ΏH+Hi5RIÔdO>2A.%50H vnBCҚ I! 
@!{EL9S)2FXBsа5Iw4eX_'?O[FrsPf#vxL_&2U}Wʅ jγ >~e/{_Zs1f*DrQDin#GEѧح6>@s6bg{LE=L$,Jp-YUȇ|ɓQs \ﵕ FT^Y\ѧ MŊb(xQCĘc=$ef( 5YЁ))R35([VP9I$0"|,ft{Cnn>$Z^?^}]iOvae>uK!qi1f;#/H${gN(R,ݖXgPx iPcmو m)c6{)A!`& ,c khT%к<9=Iv:+r;[zljksbC![Y(x<՗L:hXaڠ PZ'zD$&ГETzD46:Zlv*Ղ- ?jl1p#]&BFvp%Ȝj!Gl\a@x\2t@`Sk̜bJ&]I^_ʲmgsK凧TgUP[-9jz'+pCH0yvt<bHX+wMj$VQ y@KVS7I +%12R(j'AaK.fV@G(00%MSfT+]8IG~I,/xTŒctDd#U"a&1iJ3}iIc^($dݩFZB#Hϓ|?ynGmr&yu=5"/u>Vp&S6ZX0!d.C$]6c8T=ؐ7fK67A.di?ĵ!|Ȱ^D柬zɉMV&4}ߐeUAډS[TFv|3Zا T*TAi$T 5Z5PlQQ )r͉V2e~# uplM"|];p\#:^L>:<ŴJ۾Rk~ǠR HRAC-08x AP tze#'Ix!?@VZ1t( TUOkJ/]!gkB5p͙ ԑR13&ԅd(1x&4ٳvoXNnL64sa.s}ws9UVw>|9ې}v ؝cV\fwz =~Fn5m6M|Gv m?t=gfT*ОGj1ݾI68G(Ssu>?ydPcɉ>`Ԯ};41!XoƳ}msS5"/72Ȍi n fS|B=y0wϖTNC8O/-;1}Ix1ǗVE>"3~=OGaGᬦUN1gܭmJzjm}9ir;uj'!*m@Vf ˺tl~S ?asM_gҧ=򶦗pժko-E/8Qcw|3cvn'vGvC'~ SzիŲS]F=kImMnWhmqէ((7&{ޑeeJGJgQG˟ h6raf< ?؄{GэK| Ux`ń h=JXcheVl1Muy,k\s-UZ[-Ig_/e_igl}ʙYp8mq`oE2dJ@ZFBIK9kR@c\&ٛ Gcjh67^ aa.C!m=VHzSOjwSpΐ}y5聝|g uSDp(`79 6aTW *ގ<ތ.ގ=E{bDN]0ZJ iL"I^YBdoD+/Ay M($MI"qQiCGbh s(4&'ic|ݗ[;Gjkooک>]ڶ{leޠi5ICtz"jиõWn5V.Χ˻Ҁj^2@|@b0D dbgi{0@H["L0pNIɤ,&&jE%oGs,)Uc.FHBYtdT&,y%*DZ3sjuzJs6RFE軈K9^ߌi2cщQE]8>lQlVsKjV+}LaS=X0GWr+󰔼5U2 w7-b:O0CQY  h+t6m:Uzvo\K^7]@&3 N,xmJx]!{*OJ5׏/>6ŷOyrq*NyzOE[7|LM6rT<{^n-x2w/VyBYq=D*)Ψ lUsڇ Z J-{A{, `U5UG_[[+K{zDD{jzJi8)\=ڵңc3\9\I?\-vzɭրA"9m\<YNaok<]!BF_b(h?/?c85I/?o4(t5곁jQW4[;LW+0aZXlચ{[Z 0vVZpJ{>#^ mF \Uk;\C?zz$m:_>.'0VvZM/ٕlAfLPp)0;f}1+;C&*g]JDhJe>J@R*!|R6AU +ԃKxS{i嗋Ͽ"lf_. . whw$أ^ FkSVL@DaD)C TpڥIѢ ] t!.DЅY]rg#^Ѧ?MjlTk9U+}}"FQ  t!.DЅA"+"@#Rt/|H{Rb4O!eBQ2 ]F(t.eBQ2 XQ2 ]F(t.eBQlQ2 ]F#bG.eBŎ]F(t.e&sZW1 u^,wK?3?\g{ =Ti2@~&(S]ƞ6 d6N-izJg9fl f49ZH|cIGQ ,bSRI9RVT%+Bc! 
&g5 5´k8f0{ڔ^2ţ1GdbFȠmg@Ɂ;Acxq^>F=^al`"䢯c[)DPkT*Ȃc Eʄ^oqwsMa$`wI`"3GrYdv6h*kj6SJ.=#uH.J3)8:g !EY=c_.R3rPz F<{bH(TFݭ3?.-NhT`I5jTaΘ`YY pPaۇ1˓hzh^Mkk}}wKb{mĜḌGNjxFx2|޺ͷ^Q5BliQƑymèà[*eSOZ+XWbг1-[VOqT%nu5mn+czɑWR}2,&ePƾܓNN[?k1v/Xþ/VA*=/% z MbE%59EK͔Q!G--owk-{s$Ұ1{A=g!S+Mlz՘vb,?̒p+r^_~ ,&[ zQE2|:-0xσE"ePX ctxoqpԳӰU.zJix^óyM`0kϩ hl;hmBQƐMQg\^׺aʳWif\9iܙJ^S{M rOјxg뜷1 'T]l<"u> (H%EeЫXu#ͧp^B,^( >$Y4C?1 &>M, ]ʊ}nrov }9 9Mw.ALöGZ/Vu&͆͌m7lGydK#I_)|v:w*0XHa0?Vty 0fW{3JD:`0:W9eTѻ>]So]@>СkaRw`as\ݥd&y^ȿ7cTLhSb2d#֭DbJ\ZB$/dm/D"Խƽ` 6`01! 4!P2  f-@  U\D $19e5 b iMNIȹ[0Tu'vDgiq1m)X)r[&5e>OMuKSr#yQQ%DU(,xo.w"() c0ee/y<8c%k<I%_X`H)$$c "4CԬh%s@*rM>E3w"A%O'u]" ’@0H1;/\B,ZS۬5")YW A/V=w;sCL2Uhg? EW5a@nX+/u!+/}?hfx\':GcVZX&MDe/ƻ.Iw|X2-oiU.f|LҲMs>^K=M;v< 7nhy`nwn66O7y%hIaU~- lu^RIce!p&~vQ.nh8]| nJDǗŗ˕[exhĉ7h9Gb dއ Ih@( bF (&'vgv^fGֽn(ɳ%,] ¤Z%F( 8V(eQ"P.'`uYGWzwn^݁5UZWݳ_Z`߃u w^Vl.W%뚦T飼^(ZyW+Eu~n"5's l<6BhjπRD^^ 5T@;S h\)H"p_T`3HR CޘB\,0I_C)eHFEm" ɐ)D ZFX tvF}[ 2*&Ǽ1k%6߮ӧz BM{ 8oG ejoW"E& U*3}:*,Xh1ѫKfAe/TxI./t_kK*(ƵJm|,0TG+gO5MjA1A2C.9cd Ȃ3\ŀ@tiio uE݈]:˰0[?6'<Uw=/o_U" 19 5djC`6%VԘdDϹqű3W\EY̕kEJIj]ǨAf/CT8QS$ i7rfX'TaW2ĨXyPSm:#|V>lվ#տZB-S@:tz$ K[0z0-AdOe/=KWʠJIw)/jib =(ш⒡D,ET:Zq>^OzrzqI3ZؘtSBdK0|lI>8L"?nDAl' NNu']mo[+.cC!ͦ-$-{s Cƺ%Gãˎd)-'n阜C ˢmKh]6Ǵ^X@/:Ydx*Yeȴ#CW収6@t!.G.ϴCXdW- >v ,SX$xΩ2G5p&.Oc8>hCvӲt5,gˁ葺ydV7nD8$t&Olbd>Ѕ`PϦ-4˙x.m z[hRH޵~m-3bIۊxQ@Atz}>=<΀UʮeK Y-k.(PDL '*V{N) RaPx$ A`1RxNl- }H H-[}~i`%pM.t" L\f`J]5rlx6LF:7QG׋+˞Nkь绶حpCiUϦ${fg송gC/&͘koiˍU?y/iم '__u^<]On2vJ׮5I7-#|luJ[WnO_F}-X)k% *yY)|m8~JNwVYx;n>7O렣IK{la" 4Y&f30 Zg-Ơ\괶Rth\zXzy^csϕo>l6~ӏXZ^ѱbB9#5B̠7.@ znL)9鼰"3\Xަ1֤}-?xu_2ւYbwmg<}Cʪ7OJl]8-8zʵ@b | t 5p8-cQB)֒M#87\h0GB[LLK#S1;h,O"on:\. 
V6vU"n;KM}lzHU*4&\9Hcjҷo"7z~-j&}]{F}'YF^5}1r*m{reާK-ԽGi R'ozc ^ntAAU\;BUS4L˛_4\Fɝ kLl/620Ff^v^:a, z/߈O)HGy7*;zT&[gu[:&B6Ou1ikwJ!N~+|ÀZBbf؀&Y*P}Q7K+̱z4YLn KE9&qJseUlGF^WQ#a֬rh-HMF;LoqѻG?Љ4%ݚi>YvƱ[q_{jY5_W%T?U_JT8fTӪҍ4:L;=R\M[Ѥ%A;6`۞yko&}Ap]dDvBg؋=^\KcBCoUԬk{փ'y9?)?87*uu&Xv#&TVh/vË`ޑ=F0/{3Mv/Qx=֧aeFd1 Ys9YQgn}b9rà Ur.&˲Κ,/-%Ȃw2Um#V넹!iztVbPLb*|X_VV誹V8 ߨf۾kdVǚO㣸C4|;/VH-^l>_9qVG70PCufOWˍqc͏.'g^//qR-9x6]YnVvIT /'M|u3Yv&Wtnv'iY-,Oe)> =wޛrN v\7w%jͬr]R&>brJA4U*Aߋ 5 +__ 8eys}Wn o4R S5Χ޸!%U8{U]T?Woޞr폯=$Rm ăIC0J>/럷ډSIyj+Mo=[-浮W^3Ϋ|-بExG֣j@|MC[Ⱦ}}*| 7D8'b%,,'$%$gb3؀#Q쌤6̵K݈oqk/DdoA ꝉ& Z&P:,b4d]r3NˋC [:2CX qى>k۴G{L*wCy2Um Y0j9s% #EXSn{'埻oZxW;:w6YdW#Ťמ6([\g>e?,Cn{y()/^iR8 "O]A06tn,kOJZ5eɸF) ,B2Eˮ jCꥲc)ĵtc̋7z=:][O԰}!A'>-E-EPtu C/Fmbi^c4t~[x/]CKs\>hQRP.MKGn]Mz!^3`5^CӁ;zim< ϛJ ͧѿZ/W}@.lW}W({-ʼMyWF*6˼/Ïwܬ~?-5wyg(y>oꪶȔJu(jϟ-J^ɩ`mé+i[L#bnY:} l3Àb'|v72v)[p=z?mg&\)ۮ A'm EԧFf&B-ahn}53Z=8G>/yGKޞ-ڝhAVkV2sVѷKM[]{*ǧƈ|jccW^-F6d3KN r#-KJvnK:o?RowjE^"CH9f.(*d b2%"LBMYZ&i|@(*"6snDtȣxtG~'GߦI }{񺥟$ U6o2RW|w֍7z0oSNJSouN">]-LKsg04K "QٓUTjiUj{--/TjiZZ(PmGh0ߒtc}fH`lQ阼8n ( ]yRh*i0 ۘOw ;FT}MعƮ&uجٴ-[yU,SN@sĩl lÍ}<٩cW`<.R:L2 Q'e"&""8Ċk%V#J+^@xM,<섳! ((3`JYcK2 U2G] %\Hٗ K4EH4҃ީ,\&r}89=v`Cی˨YT5fA_bbi{hnϿ۳YYmxvDĶ*鄗Y! :\I NZ kdۊ.& xQG73ϋWV*i Y8Ae!Q,:cB͘15#'_J_ KV|Ε[C|TAeG0 @րN*L$n 2 ,В?=kWs!K)\ %%)>ېi:/6ODmTIPԊwW4I/XfbϠIEҚ~),"6e1I.XL<R^ 06.Ap4GUٞx#՟CeGMDΣ zR摋zR)((O\#mMDZ "Dhz7F =9kEve>ݘEzGt:/gԎ+muѥ\"**QiM6YHY ю=`뛟vȮasy_'UFuCtDMX͚ׄnQHTClJ&&i<9rMKWp~DGSI sF+ki%s!TaUB2TA*Z$Z` H S1,s"e2qm*Ǒf$ VE&&xi_竚ku~_gڧc1XmG~gL"Nl x)8DY@k xa˔-&#%࣒JδЇO:y. 
|ʘrMltjLWd &Jʣ:4 ;oXn\_{q|QO*TWmuplo]*[&>+-]ϴpo>{yg_"76;e7]^/#Aco}hjͨu7ښM[osyӾ'6a>>`D,T:tK`EO\|Bs}ߡ型Tw8+.-/NO%o'ŭڽG<A蠣Y1JӥЕF(ḼEkPTIAkS զ橰P)Us'jp&k^6K_x(ŃG3r$oȥH5oIDT4ZGymy~MHXJҥ(1A:I` ɂ˜ ;^ځy0 h:CjMr- q[>@Ǭ}ɾo<`<\)_~:cG#Q?F cv^z''tPJ|jᗛmɚDvA򖹂w٪ h8~yp _R&03Qm}{}uuZ/m4&gkRrJAeMv2.M6G֬Rwlcqu6ڏ7wvSo./dfl}];EanpA+ͷ;뢺iw kH"dmeβ$/^A"D"Խ|l:`e XosϝS!$QNCɞs `V*ؚ7x_KE34y,z'&>z絻5=STf/xK<5%0(}EዯYP" ie9UѴC7%%FP乾V I^&WАRHLԅ+6PR5\$h"o+D2y^} W7p1lr D2k5rV+""Z@XrJ\ߌO_ì?Z ,Pڲ _'^j!+NW:|b<.9L$ %n4^)\!!2hRf9q9Vb%Q_1^q[.,r5f?7EV'u|q790Mu&fBI)e-[7>H`c JЬIEoCLP9Qԏyg=Vt"+?{Ƒϻxi H%r9,ayY)?&o9=uЅl' *EL̨-\k'6f۷wp;<]9u)Y8$%&#p)L?ͥJhs2d-hF9;?N:f__aՆe yd(Րԧ>C$+Ej>N'I_LnNk.D. KNx uD :J*yT>.sDpQ-FYɪzB'"VN(MrhpFHuxII]Y$t( NC:ȝ jBXɨ6#`'ɴ߱ݗ9T[ϣK7NMn:~i`eh՛ACs_ab:xUH9Fy:S*O%sJv)'ɑfNT:5dNyS11\6Ct0ʬC Oߖs'#$1;_0 #?iq3㟿26M4;,H'OPը N\{qW)׻npcyЯ_#miaym:tǤ/d̗xɨsdI F;ꃙՅյ0D`w7WɸRe"fI(.hwfs6&77 g7307* ܃I/)O _Rvk|Kg!%O ,M;R:Z&$%!9=G/^.N8KzRrVUҚ ꔯ~ņQka}ί|] `v^EskeMLw%m6%Hl9LtOe:R"բl׿Dqi h!fXE"R>Mg0]2"$C:3[hD1ܟ_\xQ9sJ٠8/5QIX"2 dU#br_+y)EVG:x#2kO 3Uu,޲`2W"E}>:$`wN( Gr"E2;@^M7j4/;leȥ^ UBA%)hq{2fm5ϖ jרuqN1ϛ??<ҴFZ1?OxQ%cŲM/aL~/%~dP[ͮA捱G(pIx:M+Zܢ5bQ͞Hz::Ghgahq*(O[][Ȥy_AR"UhqQ8On4[{(z[ͽRN+.erN[Л7W!R:Ӣf94׽ZX\G;gfa|N/]ʨo+Z,HooO>^>x{ݝ zz33tVOw^3Mf˛oskyW-H ΆQà2\*Ȧ<`Ϧ_7zq5=CQ"FX+˖Z:z.jP[^KHȓyTq,dz֩V~iOF 6?NNlчEh+M 5N$2{48UswVZܳT;%Ζ'/"o?Xû߯?_I߽߷~>2G>`z X Fjho=4仇v`[zua%jd|6Uy*fA/M*}D`q QA#^M;sru~<߻o]>Yc-TDj%SDrŸ%x_0HG; ˷ wvOn72ZE2Zu_ ЋWe3sؓSCI.PP()`=db|-Ag7>ze(W*1 UZT(1"3$35!$Lhצ)51#F)Ř %b6! 
5硏EPZj6pztWuTȊЬ }g!8B>NcDEuP"[ITTg+ݡ'*Tc^1Q1}ӈ?|$l!ln!.}ubi$ViG00f79m DཔQh(0 I3U_1 &+|u,&4Ay׆pඹn>guBv_;tvhmoº\fRtzѸ_ 綃^;hۑ 1d(C #4[ˆ^꺻yfI֢,ɠLLd^XU0YKFH^wP҈`;z oPgҍ:0ndtGB'؈ 3bbt9^xR^txGWTJ JdcDKdBH+4!bX+&5`+4k[72WejFh)^_)=K'JIw)/j&Aa e2?{D-K7d!j5Zr:'=c$ LLԬ2b MUlR9ȖL2R[nDAlO~TMA%o'^IRapsL#QB/>EZ*0I͗:5Gikgr&y8kE*|s!ћQhcѧY (:$(a'\JSQ<ډuq~PytaG#Ų,qr#nMUPd0:T5w6Iq@Mk4/ycz\@gj7`N7* cA*(zHZTZ;:-%>w|P)P%\}L вMBK Jp1{rt\<)mF_gum >FB7-'շ-NZzϖYgkгeY^۳c'[E_B\lX i<С$R ȆG6|l8eI(PHXĨ]P\C%H(*\S6r) _l93RXh"$I% 5fͦgR7?|֗ W;C맶כpsR}Y\泮Nv{W쎮W]/>}'y!/sk∧?TJ:\[wܺ_m-[=bG6}ܲj}NWww"+{|zɫ5m q/UU'vmlM}ݻ[nlhgH}km4%)IWP~}Ѐb4d^XLPY)e4idI~٪jA`\kbT,Z0Y86fhΥ զT:]xԔ*IZE{s̗!EEx:[m6u}TO]]=ӬF| |ޤw=7n[K)5+%6jLJJ2]bј3b`ViGښr@y@Rn&»BoĿbH }]ojʉ@7l#y!~cж9~ZLӟ-r+_:}t"I᳃4M>w_]Rf-?|:}}ϐhA{jtEPɻuVQ$i#:aC)2AHnSѷEcUBP:$|LOtrNdfp5z֦OR s(er1%H dBdAP& Y^ȘQ#ѐĐ6 )0;î ~Z0'{xU-v Ϩcx7xj`QQ%P ʂF@1CْS O3>;U>3~_pΜr>*'X3ME| o+%;^[J;z> Gp4en`\c9z-[[ 1tI $H4]$̞$H4ܮ{5ڿM9Bf "d3^ Tu$dxcHR MN_g۶ t(RɩƳ:*>*3$ :HiM(Zkf<afg mc]h]E\7oȰ[|e,-Ϝ 0^N/W-8$)+iK, KiNXr$dCt*ܶtF] ?jRBǨjSK`"rʓ+l1Erac݌5v< s_v3m= ؇]$@G/,%EkQQ.%XBbLjVJVtμԑlUp|͸dW9D'Fȡ (QRl7B$x0vATvs}ч͸c_!A=RwQc#c7\okF?n7D?R# 2`%7SȠ2`jBJ-B`!kU%FnᾫJ<!JA]nԕrՁ0/U]zep5M?ƲwOr62dy&+#,]eݩ`o1Om}x̝)veJa@Н ɓ ij޻ZfjH>ED^@͍)1HDuHuF61XUI*Om jsUFEM`2[L.Zb&QPQ9I@Bϵ9[Ȝǐ:ީ~ԙOfw0'xŚи ݮ 5[ubjLp~t5zCe,I2,A4QF3ي m|MҖ1x^J>Ȑ M/1˚otvN)@Df<.߁ˆެ|}u;lߦX>l+SNO`Y:c6,JXV]aٙ<S 糲;cڑ-Djde]3Om0ևXYVH(bNާL&/)fSȦgU9Kiǎ0jm3PV;"[`Jm֦{z+ VJl?EM`r1&{zlu _]67,˭; tַTC`lIe/j0h(HrRRUvdEy+o:ѥŬfٶWX/ r}!U&IX"r`AȠmgN{ASg($Od2-a݆`RZc+|RӃ&ǩ,8 _$J_gW4pKmMa0\`g YF: f2( Vv g.J3)8]*tΘH 1Bhdzl%EjF,hPFY}@\wᜉ!}|YHOF-~֞ZuRc} p/yp3%cvGa5?pXy70%^yԫn \?r/Z?0GwVϬ5l T OH-88r f$NfsV'yg"|U2$!rdC~rkr銟8}ZTy(O)|((S,j_Vl uK[ 猨6ixL2ׁBݦ^>q\X?]è>nø[3Z^=Z\^]kgⓟ/x}IGfMIm<-jdo8GF/6{fwFM#9q$cd0j0^ٍd,~'XbOՃ]9>_;Q7̕ezYu畄ԁ-ϟOS,N˨*Cy G }75 7!8P[?~)0=}|I su +EÂEx6J[TKx`ǣ4>;f!#O?ӧI*eS-ĽIz;yC{jho544o1-a?נ֓zc=ftSv֏j,/ne?h=ZH~$W}6b Su]*n\FIVcZcNzz`EK`#a\^zOB2x1+!I^!X191{(ct2%L PA249әپM7㉝ҦYv]oܻy @yc ϝI{`*a;RճD0(x컧kf=Zɾ{7xv/Ԑ}+'+-oxFq9֧\L utV+*Vga/ ( gR 
k(zE:c/IFWR1"ZWa"JFMx F$v*B 8D<"(L8ZdІLJNWiE ޠ8]iT}lcދHL*X фZ=],@IH(?]d=^'J\*1>zl')SRX K$=1|z׳GTT)N8ˎ}6Nw3̳]>~sM1ӿKlV~f# Mp>}}hivzop(ϻȫ:ONk!(tlN Cqܭq[Cr+k bY4I|^,dE33_=@ҔL*m F L[RCGw<xByT3Kywmmy=c^E2`fv89;"Îx2O%ْc]bSvn6+.tr4y3sdj~= wW=v}mz1]OwV?a~_C}u8⇻g>n Ʃ~7su36Vژh{{ֳ1-n=N7ޤ[&ۃt>٭N~xYo2ϣ`R5Go?a2]-Kyu씚HV ^Tn˼f}ЪTL( k6Mhi첄A&(S թhb_O41E ;m6-g7)O|lS/l z)$)|g!Fw.y AP] 0ˑ^r1 /$Gb=VH2}J+6>( BrM9䌦('\5\sf+ubdLL"u!J ^o"V?8GXbk ӫ]|/MfB苔|r8tOUgWGtxjÕ*誰Q>+ ^M^w=xsOgrho uC%B^ŏ,Oruǭ篭e.tz޷w-;n٭F{_|p#zhӯxj޶PV'biŲ&<)%BIrY >B+`D)@}PvT&ʻbodr4!#kSkvbqoG ՊMK[%J'KP+wRluhgӝHr»o.T֟r٧PڛB9&(t)<&M1kR<29 m9XAg\tr(נHUWLгt$0 츙8GcH|'!QFL^19[0v3~?V"Sto!ã[SC&V[1"YTmdUHK0fD BCK7 (Lee/`Aᜎduފ%!|:ښ7PK8.5YHFCB3 kv*OyL>?ev*Wnq_hDcuY|SUH"y6ή;g _1p'-TuF lutu36ژFީtTC܉ZX\tnqu݂C2u1xPEγZbS߄C}2 I1&5MbDLYAdzP 0r#+D.X\ #(lXWNcM5Mv@Vܞpf6KaƐd.f6135٥ĖY8H*&h9̟:2.. l<]c\#.o4ʈ(rݑ!&" 8$%v# &#Z@ȶ㈋ša1;Y0_si>],-`/,}D°7w";o'<'_f7:Ɋjytz/Ƽ":=tRz;+#A_ \UqՋgzI_euقE^ӻP ,b̯? YEZ<*Rj+ HGpE;?Jsˬ"i6]+R: U"q* ut=\}@B  \}"3X4WX{GpUޣ"}"<*R~gC•tޚUʣf:Y.'ף:[QC}jr18uON'1wAZtwSTn(<`1W%0Ƹc`Z$IsN+s>t_%H4E)쨄$Y%B+1W^Rf'Takƾ6VX*&E2vzv-;-[rW\\;zPNF o$({-'Wu{D]䊔k~'q<9k68sM>j]5*VjuHP l BD0Z[GܐMAerkY#s_XQA%檣sҝLr2ߏ -h\\Jc[ 'x, |fgP=bm\R9-~䧎+w>:O  (RTQˊE0L?j-.ٵZ\+YگťwE1]peQYN&˭J>yt*\ 0V r eԉ%(*eɭ o'Й *md:0k9[klXc.&oTxkZ8Ĭh>~[>]o~*[> gg{skgF yCIs3#%#I Q$9jis,1Jk3*C܋vy!{8!$ E}⥰9!7FZ"UDޭ?ȳށ_M&PF笆4~EYgiֺgboGTWfʉ5S{HA-{يǢCEK^VLnS + [9~6!s@t;âJ΅dLf Pj$5iɊf\ JS* TRߞRB1 p;Oƶz&Ά M;W߷^$6_Mdrqd]y]*x<٣ak"^ӤjrW "9ƊQ 1( D&,&eԑqFD`>>Uab|M*li{VcF'b 3F UNAppn"!$-XQZ|:/'\ؓ=>'LeBxlY"Vr~oǿn޵muM-鮍mz7׸6_| + GAR*oͭm^;rgHe{yDeZ;'BKYR9IJ.:Madڋ KpjK K@xprf鐻8+:8$HD #7x!49*6j鴲3MˮgmM'JڬϺjoC(tFDBI{D Ja:ylu #-C"\\AV,yNXH.rp15)/9em'fD!r4^Sʭa<P o㵍v>fVV_q1Hd+-'CzAFAmC6v曽}>ŷlk bP6ˁxT0Y9;^|[5Y KR|ыe\pFwfkka[V#k?09, ѭDtJ:%v85#F̆ȓdx-Q0hեVgDXJ&uɊsRqϝ6dὡ?hd-/&t> u)cގ35сWS\p+W)=Θ~>Gsx2[ޝ,O=y#Ý4/G}6-3ڤFπU㻊ާ?nq\9q(<0E1G%Q?%;5QƠB::$[@n'rڽD-htչkzR&۞iӒݣ8$0S贈VϣqHZ;_HeqBnm\ ~zJ9?fmĚ\dC"f*l22xI}!]u.wqi?/&T>=7F *+D& ͮ]ɢd^`9;it" 1)Xgԥ͑-1= }\\4_Y5*3~=?x2̳! 
!h[2>ĨdN ]:A9E {A^./Ը42=+!Cf>r9!c:j咔0XR 8=NR42)cmSw)Ss9*y>.!t;FWK,۔|8,/05P`x3ڛ%բG50D _ cs-KN``)F1!>L2Ce-04T12\L"x,zzyoȓhӄ;Ct$1''0{L%ϔΒȾ) IyZAB1ޟ I!8|cA~aHr|>@~g!Ik@j$spY9fշf<^!.hMGġz)j۳-6@ txJPcV J ,4 RJŤs@!("粇`:_]!"s?xӛ,]ſDn_ZLDH0~um<}ͻH hY")⮧{DctOx-\Mn<ђp,:xKH m-<<=||/=b)C9NIGLj'ς:!9&3ed[ӎk^ x:^<9'ng*|,(&c&(%XJ>{LeL9!V*"ЖnlUDnHEVmE^ylֳWpTx?~FOi~y{i= 3r 84ZW)о*u+lPbrQ.\Wox mZU*eg M\q:Z4$5TLUʻV)(\@y>r.pKgG jTÊZ)nkl6X;H*x[O, %cKЯM,/^e ֻf;e|F*},[*K]IP`EUT;& -oANjo.s&[ EX@9&%HR,Q*H%m2rKRZўv o(SA+=I er>`N#r3+Bm4,ڥj j 4 3IV)a" z=_?{WȍOw@:w܇np{AAPdc4#L߯zlŦ?tZl6Za=F' {[g_2Wg|`kFhi__;s׹KʨJIZX)Ɉ !~klIV3Φ#Oy{HH&҉gM %,S(𴩊#cdEU88A5 EVmv*n|4KB$+IY$!JX <(ԅNWp/A [ᶝ/rWcE/b|͆}66~!I!`3eESyH@d.!H$m-s8TͣylX_bF}5-./Ěpf2dU/q HM'46r]lʝ(q!#Ԯ Tgr 8;[Z_Ӎq` =33h6[ P"Ȉʖ1hCy) (I X M-l=dՅ[K]/AZs j~L<֗W1 bjk1?O:ҷR$EALI ҠJьr#(r AHƏvD!#C2VZ1Aa U$ʱmE13lQ לJ)aS c`|:AcYsͯX_N'Z _kķGnxJ%xsagi[go=?=wٓOީw6m9[[םbBwk} /nEhի ۛ_iٷ{-F7w޹oʌqa{W_Ƅ|#r;O}˾vhns'l|5ב??_{Jw͟3|e Wa o ]+_sQ*^G>\pWf%?٨?<:hؔ~X+N~|>c[_Ĝvӎ+.bgOGޮHwk2|Rd{1cM*[*dתݦ 8 OwoX{z;qu>d'l^?9;_̖+`Mq lh|/M=]?_^L?Wo Fqys_c\Yd] ;');{ݴx{>"܃j )JmOV'QgPN"Wtkd SVd߮N+6^hln5|şf.*V|_OBhJ"%Z8B""o|@JVK +jIOyޭUU>r _-;X]`J B.b`TBp ڒOjző)}yahyR '1^myļKY$J9 *NIXԱ@PU>yWܭNcb^ļ=.{Hkµr2D$+"SΥ0bH b(;02鳂u]cF8z[1@ע A19P(**S-:cdP`qV!/)-yqFP\~&r4DgH2V1Id{f9 ICW/.7dCZl6J__uT^FD;FD HFWFc]3%۬E掣_| 8lt ޳6#"/bU78?OUAkoVBxm\>،XJYIJ|˃‡Dd 2a$=i@j,66$CGs!R [/%gJ:]qK`G%ml@6ȩnWmkpIkC[uYŠq9 _c.v/;_K=Ju @wE"FՉ fc&C4S ktO&|<ípJM V).XdւyM dJց˼)~ )y{<2O( xRË YK|&-͛'<1~Aj}1jsjo^2Xf{x_nq>~Zlֱl. BIG:D~oHo\5JwJ>FEkT/:.^"d;g"^zc i.ӼAYk9zh!݁Ɣdl;nlMֲ(,ML6([s0*#zUWGz X 3$ 'm΁p 3&u-٦KM`B#jqf/rNY++3V 0{YwWޭm0ÏPxA u(k /`?+b^}}Ák)Vzy,Y:clU[klHU#$c,0L삢&2RT5U5JoRͰl(b ^!L˔P;dB XsT hcB$TފVkE='? \UH*!*pQb)oI(勠l iGB kAeOVD·'6rJʻUT?{F}X$쇹;b2|ڂec$Wl=,?Zش,; dUUl,`.qlj'GPPTྕS[ug^)7b !8u0 TKHzZ6r"t 5Ab9"j<]F*K$r4ZҒSZM ˊ!dGr< dqpzMfQϼD7  c\2qp4Je8py:/ ibOX球%K*^I nBm>ѧbQ??f%ڧOҗљkHۢ]ʡw/N7?j! 
h >[+@'v~sTn9t 菣3Z-5Qxsˋ7Xc$sOd|tx^ h:ryѺ>ͷ~4 0Փ֝=ӮnDg7Vn6b٘V>,oEϯ/LJ٫`{]M׎o\:/ұ<җ4Qf }DA5Z d8"*yk.kIBz¸CeO(2w4;?]磰$юwGlTvSOGa|vLL'~|ן,~ӿϟ~'.̧ӇsO.)ǎ!(#<_2oߵV]IpZbY/_x~9~/ZNc+BX. k9ҍڋL <|s,7Pki~tK|ğW|: H1C1]"f\BBs4jncEl#Nt]V¥i>Va]h!H:$J:Q#mo%ӓ 5_pSNg..PP=BA$ =>SnvFIcބU˸N)lR8N袂6Ƣ lJ6#Yi+!c۠!ML,Ƃ2B %D ]x !'w19Cj:AIB77$ (ϼ;)v5>)ɁxӶ^"|! ]?渭y,)3FsM Yo㾱0`˘~ǂXV@|B ™:eAtQED14 /uJ.%TYV*e۔rcpޓLՙYpzח^i:K|7~@zy ']j m> '6g 3?aߐ . @jοn9?@t恨6[Բr'ޡ|aˬ=(oܾJXp7sg:u^F^xTTrޚ鯚s,ini~th74A7!\q jjjk RF+ž);k Tlo!~\W04@&4lA΄ߒUq04>%A5쪎#캎#쬎#%,j6ls X*'FxS5CZCGO@v{V%RJEvPGTǖ4jΰd ,X`tZK)+|edVC#M_|tA5ɓd$wG0$!+I'oKH*)0K $L"y \waRѭ Ѓ#8_Xr~Az z.$m8>er 3*IDG  a3X%2[){Xae֛ТgMf7pqR !^x. z |b<6#L$ƌ$J- tRJ`ZyAdTζb\buGz(εČvȍČwy6/b?tTpK^Ioif|~]is7\pNJI҂ L-8m _JOMކ,0+7kޫ-ӸK{P zQ_Ŕ,f?ЧEـYӇOC]kѯ)vuE%I;)b/7~q:sEY;4mw6ע:nH=THbvvֺb^{]4ZB6aPq\|IR~{R!> z B:,o%?2-/fN6WHLpu<<8;_-3i)dޢ'#?͚wIb,sY"{ bC .fǚ56cJ$xni{\bNTpk|輳"hYPgU3S^ A*NSN½w xu^ف!5U&xT@O/⊸pe+vKQ*wJ0 #ւÈ[a, ͞Y 9Q' *3R 8oۀBߔ J2"=W1yڻbܣ3J !jNI.ΙWQc~*YDnr۪f'l9y>rl: >}`8VD+Cj}Rqxl6s:},?LлS/N5@Zo:A+3«JWc,>bqx j L 7;9H9t @h+e56ēpM@S6kW7NױRe6rJ8D&2 H3٤ Ott[YϪ '| `F-T;^im 2#rmuD\+ H"jhx*%n",P_ r Aq/s)SX\t$bZՠV zҠ'w]@2 tjQ#O<쁎M1FS9/RRhGWhģ\HfAMW5'Apc:'00/Ћe/JxVGYd拶:ZMճ.SS|i|նcOUf}lRLJ\2*äB& I{.ZQ "$k0fQm<;2wh5IwzjҴMQ{0.=iD8Y9$ &/l$;^Qk0rjJ^JZJ^*eZ %ÒZ J^'4eJY64J] tR;rc1 S Wx!cy5IF(1FH)b*4iZp\x##FEb1cbPUM;'<0je61xo[(c F|pʚhc%Hܛe:rӔPe]iG[N4^zm<e9cφMӖ_$:Ib٤HQ+"$%c'+$=$u;o)@-}s^Jx6SVsׂk4օ7.K}{q{5;Nk=IlObvFttAL}`4mzɭ6v-bm5=>O; egߖO[c6QadӃUG]2Is?O*')Y[jÆX1zNm=JH2ۦ}:ŏ&mͅAnޘ*9,ʂ5TurDrzT"DŢr?K}\1$( oDN9DpIMp\kd:h,8%~\J0xmJ>4b:s#/%t6jY{'=Aajz2z<ɇĒv! 
1aL2I A!Y.ٶĥHt|Y ATZww{i41.[S]~s`bʷgC ZO%$1'6ꀤ(W8 (Ee t^'Udhr{!P傠'Rl j#pR[%(]&j[#~dlOWi/- UX(Xmx TE3n?][Q;?@їAWƈM F 1 r2Ky&5Fy#vk)B sQ0^( -M&ێB$!e حs?b0]Aָ+`-#s`u%&zVv JjۭZ+;WD~va**ju*TV ++j.(wݹ;gj^fWWf3O8!{L+8fÛ{tH9͙GUczRj4@PeVv `ې Lj:L*W{~90 F\&U!W]B-b JiV +i]*䚝BLTZ׮^$\Y-Fʶd:$Je_eXI"$prOW!zUHܒ(:T!QT!=x:V0+^NDUTX\so 2YaYeEh:aEOZ EI=q)q!Xih$xC@%Pp}$ml 6.ѴKKlx0:*THEB+HuS1(LYL84V$Ce$QL:QŤql/,Hc TL 0uɌV`ѐK B"ݴHZ+ X31nm>Uu/Acep,*¦RGd,= j9 {/ߞۿѴ IRF dJpeI8)ǓQ rW*JR8^m"ɸRCR„&!q/ɜWA9BHsQۆB3EsYj{p܅q4:Y&Kh~Go:o8jwmb sީFϩ7Ol-^AԱ҇i6K?/8ccFW;)C!G2Jݏq`~2C8tI:ɔ 0 l5?j$zcɓ]eW&0i=V+r%[?N tBk{VK=x9]] #'m{k]Z1MðɇhoL0ס@cf ?tN[+5j7{f4toiL.ήX_x}1=fU1ME$}t`Vv9?_LAoK;HWiI#5 #B5*Idcj|,XNppX՘Y vI60W-1 G2i!!e`EѸK_LRo{4!fu T΢dΟP-a-__$n"uZ|ƿB*L[-ju*&Ne6pv$$"IXPs)23꽁^md/)@ \LAV$z@JA`98Nڞ&5MWP!1 1P|R^xZ-Ixl\"EUlZ}K*ts<6^cJkcfHWˁ^B+[OP v^ƞUiw n]wѶ^Π;܁phؔtϧeǤrl`N[pFVPI|LpFw &inh>i|'`'^;X6twrn [ٺLDԭDXJ즓*!w/0D]}DRxJTV )|7T,9WYK|s! +9 h9S7Q3:Ftī qVRoWTZӝk7gZF5X?L)wzY!X띩^VE+˔w@ؾz ^F[oVjömž4&GfD4F+ڈiUK̆ZcR9Ly{Kj*] kqki͡z.$ ?hG|ymQm؍{$٧ĸR^om # CF$Gh7>4Zp(.,2we[U68z %U̅SjB~tl4/-xL8pB`EvdTrQC"`.^.{ݩ>WPl)Ump"heYcy佖`eIr1%"(i$Her)t&IB ̶%9ݎ xKA"}`-Gos"nOlE,v ]Y>3?uGOgm)YB9G- 1$-;<[Ԁ9EX*ɤc{n`:Yεa*8t6} 0'Kh48k;=-rz@ǣ47/|h00lмî[dYJWlղ7#8TAC3-WE\/o#߷YKE`_A!ٕe%ȴI)8D+Q@$ID <+W$EhCU7A1Rbi $ Bbern^:mKV[#~`hR&M+)tm%;Zhʡ j kYx%ZiYIL1Q!$%AW(^r[܊qmaղLuj I)oA$)f`M;EE2JxVd2dCW:ZGG/Czmw"Br&itЋU["*|KnJdٲeRiml-M.9(sJL|zSGcۇSюj+d_ЕCCɼ*eXY눚4mf#zi  HM)D;# )Z6,{pW a'UOUSN2UAHdcݐ+޲0VH#;6 >ƜD*1тc"'ChIݬ!IV(ܓӦq;yX(mX#:Ӳ'h|&;[f١@Y-+eiMEv"2?#[v/ MM o3ƀMhqnB)Gi,>bbv񖆾\^trhqIRs߶덝^_%\Z7Ժi[os4cac[gZ6ԲYnz~jҡ畖CɽwOcvdqEYǁVFqi.lnM樕hcP'ѡeJX9J4$_y\is7PsOYL[m&H͙0Wr {-pppb*|*5]eZRut*9BUSn S;̐cW+whZy3 ܐ0J=pk_Vg{J8;jtה1Nj;ZTQk=3s=cmSq:\V ilC B$  ϵ. 
+o]Xo$b1֑ڐdՃʉYkc^o{TohV˺zV [tc23R1GKY5FI!K )imIk}5:X~tQp(.^en) ݅d/F/sj>A6:3yViJI&a!@px u\n>ZFHbYS` s *޵!}9yGG-7u|3i'o]F_^x-MxPS.Fkq-sr DkLp4jY_|U1zKhМakC\&7<Ae;uN/Y/i99Bu{g]!͛<^|ׂٿ2wG%QoQwqԹ6_F+/}֭&o o_R~WPO+M2޶mMI-)=l hvg%Cޭ/['<~8jv;%l/;\@Q9rk/1.͆.<)ru9H}et.CTۯMMScWYr2_OfpߞgWeLmy$bbvaMK\lb{~a{ִ70KR2Dې40dR;x0Sdatx^tzb{Vx 紋Zˢ9q<(>B @BMB8>½UsREP*ͭ穾/*څ'UgLQ*l?$DrEZu Яü\͍=VPcyu (otI;KCP-[b'4JrMBi#  Viͭ悁nqpm.6r]yRwv,,2W6q%Sr<PT&NS }]=e#Sj ϒ6٣1|·GWȡɣOy|mzƛ4/E_<<O; y ClqOqp53jW{BpLm``kOg7ut3ߏ6Dz_f|^({ҘU'v`~/-_ +Ǎ<?.#lk֭B1ΫTڮ{)ss쾆 |PȆʂ5D3j6'^N-ӉKW1 d^«@eZXd}y5xe6zF%c+OI,,r2`!8@dL2#/Nz WݿOXajt[ bRd1Dah \KϹ j#c5qv#c=[V[bcQ9-fߋe!mmeaq'/OAGFlBXt*'uLa4RsUd)1+d^2T>e[$AZ8{!) lJmph2v$d8c,[x+.橠vq*z-%usdZ9D/YJV 8?u 9JlU3`$N_[~ɚQTueD="( VK Q[&dJ`&z r2 Df,]J Βߨ oH6\gWR_&2ɓ`$ QNk&zՁpqZ>4VA o՛&VRWdN7Z9U "|EUjM5AG]&7e+m#I^ļ~xö!D^!T6}#Ēh$V>dV23*+#/lyOkp74rdS8ԜsW RsA 7F贰GѠhQMVChZz4e+X ~4*豘,QnPJCx1W\ɜߺ Fg&nY]x%^l~i߭j#j>ޭxuTh|-$kkJL;UGA 7UGdkaCY\+6CYJnm~*գhbh͒%כj->7FŠQ&iK/c!t̡Q,\gRNAyU^o6%IګB« /5$09e4I-vc=JdRPً lڶﲿǛ;nzw\]vV[eta-LI0&$[d la-LI0v$[d la-LI0&$[va-LI0Xd la-L.&$[d la-LI[&YLߐ(CO0{i^ݸ8/q\Pw᭪1@\$ۡ}u–-;EagG>κcĴ9'q JL~$9 㢦KeUq.s 1$%:b$J")!gSG1'?L+Mxb]jJm+tH|{Zo'2+<ű\g },煠Z'>R<IRco&a .,w]iVu!-_8-hjmvPNP@gM`wEL V:Cdkbm-h\o 5-~5(܊.VQB<)gBlP'Um wo;{x1}޿춠U<?C^X)dˣFkt.:³(lQBf0 1^0-d1qj\\0(` <1؎D:gG_VחCJٗWt4zW\l=? S6C]P|f=wWg1RHB ǹ*D&ΡF󤝉㷴ve/o-%6 8󳊲m`r~cC}l-g \sl׸,U3(!{-!P$C`ic:\B kE$q!n ܊q𵵎~p=y[F%i֒'Twô:iCdMFYP9U ,z9%U8 m$G}&91@P@`rNəVSO`2nd2Exy_[kGO: Яt7q [ߟ6i^Cq8U&SsB  }nSƌ4-nyz7TB ͉ߥ֍uŒֳLk'3UtB6h޺5ϡ5omv0.\_ag3jfׂ#p4zdj~ŷAuHBnm:euU6ᨱO8y.>,&zٟssQMr[lVuq!y`Ѹ)ww=(Uh FN3NK3t? B,茚8p?, E< UX|9/gGI]4RTs Wg.WuJ74?~????RfN?ǟ>7|.svҰGcP{ pg}u-jho14øvy-t{\k9 1S~1YڑvtмxICQc_lbmboӾeGbIBϡke-M3M,"&ZMm A,BPqzHA8Jq e!иUⳉLH .osxQ1ANw:8:f^̠;V[י}x"rk"ϭ;[`UIyc&mgfZ>('QQZ__ՀϢ$i^(Gx"Pt\BD 9If(31FDGIϓdJ<\g2\ A\[',G 1C$8\! 
ss"1F DeO09::~蘿Ijud~,'2Djbndɲ,yk$&I*<U CчK?a7'?*^oAI 6z}èUqz.䢦La+Oh-g^#g}uQrlDpQ'Wk=IBg'?Lfl^-tnie_Kk|Q4?y[W=70ݢ)La(#hx˛0Ƞ xI >T;˺ b<zy\PSy}]qXj`Ɓx?b7`G*2)(K%R7 [ ;Yq$ =+6\D/r`,zBN,aFcrLc0嚆F:G~nBUқ3X=z9 nj]{gi`q4:W.SQRQJ"tzX%`W>?{ȍeOTW.6ۻ  /4>möHv;{Y*ɒ%%.A+E]s/cyC΃xzm62D7ŏ,f/fl55!CP"mu{Ȕ5sFEL.|a 1YkeDYoC.3O^ yt)RZoGV` QDvhE771\f2a3E~iEfv$|t=@ϧ6 ǟ/m]~w)Z9̖(ڒay~67ymt+^j} Bnm6P_YW6^ۻν^i\KϧUǟ[Wۻ9tf֧Yw-+lY5{W=?4ysס絖CTcߺ9Y<;:n->$A9ruQoqX-M:dkj͋Bݍ,/htu^5[` |k!U#ۘ'߭>O$G*M Z^9,R{FVxH20ߋ_<"0Q62LҳĂL<9kA4Jo7bt\L̜G,}o^՟2dzcEP "%B9JpC@Y*eNrjn,S-yH4@&/ Mǝ6XF'߼~Õ %stq,ZUTѨ}J'c*JtT;7}NG {$] NWmxH ZE祒[E<Ρt)!,KTy#$eP)QJ}{hXKp=ΗJ@E:0G)1.&GEZ .-*D\r$0kx2 خ0enR=ە  Yfwv a.d!Tn=yWøZ3P0b0EPC,gQ$;{>iψ[%|uFL.-&ڥ/..;mvn:7-A/mnn[m}3ܱ.{Q59/;?y8ڝOֈXwvy{!K ~Yx]"megT,uV]ؒuJф?q_UF`:־q;]?~-twmwqZկijg?O&oJ=2tU~xH-?TƗّe`fH.,BncŎօ<^~niixd0)@<ΰMy=[,&[CV}.vʚV֔,D k`Z*9~< Wf"Fϝ r] Y$. 74bDS$H.E糞y oKUY7<ө5"HUrN vˀH.9*(^5 yիz^E6/vXtA^{:|O&=i~hRA/gՕ FDP1V tV(ơ!ӱeEej^J+A䌲SfUJ\{,Q04D)jopecd TD*T'--Xi2 y*Lq >{ɝ.x6oCh.rn3A4;v5$)-AZvҺI`*-IW_ !?jY`'S7+ToRɡk$(Yh/LK^ FbA|ILLBtk-8Z6U-.ѳso=FdN.z0h=G+miT,[uo8f哵:v+ʺǔXh8HKU6ٶ&yH9&V#vb sdӜ)}'ӳoTB砢89eTu,FT.j[#@8ePu+7 \1 5XRRI<7B^QFUH?\@=DB-R|0|1AʃLOV%aiWM*DZRo"8+;Jw%8TNB}A KOAf;p#eg0 i!quK6>2pH$oi"ٗRՖFc"e$Q^lŢ(Ebh18o3H$)\p/CMiObt }ʙzt:xA5De})!cL;hLTE=!qRD)p۔ݨ2; E_ _h>k@)E s87.B5EZ9unPlzp+O|s_40b&NJCdT#цTQ T"IXr7<>wqaK%8 A1TBL L#x #v1qv#ÚWR/- y"㆛D)9o 4qNf X OFK]<@tQ:}qQE5​V9IIbBP' JT 2InAHO4RCqcb_ (p}F5 ZK n~H(4%xW׭<a9/0`W_bEG%AŸ=JslPN(C%笞T6ͮ.oo%5f(Ji=Sj&|Iހr''Rq]ؐ*F)xYɆ=df;[NgnM''ruReoU*TN8,s*XNɴNNƴTL YZ }7- !L oдP\e>⾨ Bi5%},%.n"\TE+ WY`NP\IT*KzWYJzp%Ԝ\e=5WYZ{e0K3ہ+* ,UWS,}+]ER\>lyvVYmFv:?.Eq᷏ofWsfLN)OX[KJ[*C 4rww W6zl ~LnGfu![i ;DuwmMq>(s<;3pái|_R hr.}T&oK.zW!_x>bR`[4(|B((.%>ǔ -)aakcw{զw<;ԠnԑISI'HhK2^^ oքҾnࢗ)%'L-^B&#Na& KPC0 2+FtYnDJLU\( &PIS@:#f$L}FfӹgcǮoulqxv>׷ݽ֮zm/_^-Yg,:k_`e%l$ @qdJBG$ FaU A-:(xg0yt] jI%{VJ zVEcͦsY߀^W^~=}\a7Xdc))@"k'κg1-C8f7@(JB:6v sn)#|n v%Ie'nHZg}: `l%BT֗mHBF $TI&gH[~ù}]-ثv~T*tG.^ff*dm"Kr[S<KDr1CS%Yy$es Rnj+fEBi#lrl{ec%n6xfڱjeڱ)LIwDw_tW5ssYzz?2rU4k 
r]<,=FuyG$/$NBJ,(+hd6i]v<sm!6YəZ ;I-@yU)Y=+B60>ł!2m }g^Sb.^$I+]3N2^*AH8-4Js(b0RcHY *g cwO׭M_MUU߿`mN}iUZer{3ҟѿ[]Co^nԱBV"tsu eׅ'jcW0j3˾\ٓ5ƿ!/hm`dl̉<ɢz[te5oY6vSΑI# #qR-*ʦ]} 【1WL~7>^kҌ)FDa})N5E<ۗ>dA}PYwT>i=O,(ܸʳVi;/(x0:pF< ]bU{z4T4$ y̞'VG[͘3CDI+/JF2/JIW{vMpw0h P8PIk_ʦn]t_|5j6G B=Ɋt;E2̼ǚ3;=(ZGY*Җ( 2)tMRC3FdZPLQ0 !C0QRi(3R.Ebґ!pD _iXZdh6M+OXannyvq;Venc9Ь { 煠^ 0Gsr"(7PD.i{ji_RJ= nn@hr9JLw mn8e}uC'@S8/ahG"(>'\ C?%18)X$ؔ؟:v- ʷ,Y|Uvų'u4t >ORS&[AͨЫ_̏租]uo*Q+xϪн3ָyjXW|mEh{5lGBHZթ5b]eu6Cj4|htɊ.*uAeJiD]Z^=jsKk,I;uL^[cHUd .YK &RNY,iZm R>J9yaf=];/nض=<^ vjZx-]l֮A=;TAjTZ#eЦCMSdG`Gp#AGGhK 񵚋!xv;L}>V6 F _*e[ 6 l4dRc=)g] 'Y9 o%ݸ}%aԒ`2-Wu QBUNO1 K6YDES✲,$- L!RRbf${lgDdZ-*v>9 uyI. 9@&r+zEh%J g e4Vz/Y-Ced[ْ1V8pXϚMzmqoT>/Ū@S#Bt|b XgkXPlm|"AXdU |Vea__ޫ,%P`%Im:*QlR>y6Zl#&9BfP+^F4g, n= #{Mv Ct%(v$spK"v^2|2gm(h8L&n=`M;雷BD zSQ@T$cK:?:~]DVǎ_b_ƒe"!e&c6Bw,>mćc8\ \tsgjiq_f}wK͘1_bj$jKN:-%A*Z0ٻFn,W4jh1f1f̗ѶHg>VddK ĐXe<<>*㨯,DT@/A/a^%eJ,9u`p4*k攴1꤈s)Ĩ T2bAP0p()kJ@p)18Sp`Lye,wV ^Nn|K~2'Ө}-mx&A XSA&Ngx0٣kҚܼISv 2Z3w q)t!BZo_-CFI5W+f{{Xe{|pӀ]~vdmz]Vo-6l)j"=n᧫jrޥC׀l.Ml.W_3t.Ɏ%d|ۋ]( 0+ՕCfs'm|[w.h]AzغjZo^xe^Uf!eծݻz?;rzN&[|~̻ynaytKmduZz-ri.hmBj͋Vݎm/`m[$j,mdi z22kr_|;\RD}e2HW3aJV|U=FTbUŋU-6x* %]lSua#zQeV% e&A)XѰcpD9D&qSÆfQ R/3:[d q?hGS+RQQMZT5~;7PbNB9I .'Xj'LvJP; NV3Z>`[ /s&*L$̓0FݠDW>9b`"M2Q8-ȂN>( sY 'GJ9] j}}B#gN=pJofySj%B^&hzd $XAe,e8g +9c$B؈At~ =|iȓcdwGPAfXg4y_IbU@52$)Y8+: l?Gy:aP.Lz@otrr> .tv>}=SEf)L;ps~.6tAoNݛ};tOXvZaLm$~Q顜d蒳KSߜ{O-Įwn:z7y;~iW7SR> !f8ӷ߯qz>?:yjw:Y!'WNO4z,a2k$ib'Xf;u|E<ޅO$#Lez/ӏ ]=~s'vqZc_{2s9d [.@x$ߔ$V27PЪX%zE<ߴh6x+N3w2}pBbn㓊;+H$\Yni>,H .E糞y /KUY7zU ثz =/CA^{:t%=iHP21:'<ֲ&dwzED?7N ߊXx%˕TVYvI.xziQĜ^t3ߚꈠߋF4NМ ka%q*$HOJT4}ثle\.a=3q\&V1#]DQ2n)qv [io4*_bٝ|QL6XSYMp>d7I-*lǤJ73I<bXg!D)o2Kg`V 2)]8O Wq6K6 \0j jTX!c2k{h9 *uIc6a6 hhimI2 w@uJN1 4MRY`% %bpq. 
+Bo&L J){qK@gq3ana0`UOX.4xe \>+댩'*Pt02iNِp(#e$@Y7LGb)E,:֌RE7<™YĬctAQFuLD%5z8Pu@}҂_[<,l'GI&5!1@BRo+USEŮNUqO7ZdC/Se~e}p+#9cGܝÒ_PmT IQK@Q_eX%8RT2@C6$-=9oR9'o9r|ŽfcO|^J1RlZݖ#"րbGgi\W?V~4Z^rsCG7sva|A {ΞIǟI5E'C鉷=ΖFZ1Y5KktjQ:PWTB1Sli[-Lbs:#2l~[C%%4TYPʂDCтonm/YJĄzQ!LrN)hlLhEt qX) )E/(18-$F39RLhPS8;zl>Y?{6b5$BJ9cxVFДN.5K (s4hSv&Ae*ن$WL/9&WTy07gFmTSi׹#@uoIx>&1mεkr^4h)EDUՀ"ng1Ls0G!FSq% @H)ds("Zzt$6+M9n8A>PUj“ 4iD`>w*&Ng .Dd,JES.nZNaW$NPCc 7RU"coM;x\ [}C¶nR~gVkQ\qƍ n~|%Gõ(i8/ ] 5?-ɴhiInM>z+q3 ,?r2{\&?}^zt`M?=}h }:u@|4/ 'WC"|m;vmfK^ }T56hw?SǚČ?, 5@D-58 ̯YF_=u8QAۇ^zj:\Zw=HM  ͏PZW(ʉ5+N+p5c%u]Ua d3B` cbDɁD?Wd46>ghEԈ .bZqO=nfL[YH`n+L\ LQJ~0-\x ĕ \LPJzp%,\裁+c+˾J)vJZ1)(GW(&W(=B)- •pzDpbqW(=B)pJ[1ٮ@`P\nPZIW("\A{Lp[qվSO="vꐅ@@a$(< Ah; A,'Hp4G>5% 8nA18&Y{(uT QZ `AGpj{L`SIolV5ƂxNu6D6JZ­5gCUF~_/)؃ W݉ܒ%P7<ݳ1-BJi v!XK<"9b !E`8e )0Ik4b<fKH(ŁqƙsgՎ1eT)6[kΖ1w!H 'I, 1aQXN\y8,'$&TB4WMu~ib5T#BiNNӜkUa«Vdh 6@ljrU/ ;$܆:H!q!a=HJ3cX+-E R|kXdIel}>0?T'^=.F/`n{AD@R܁IY&]! -y .F"5Cbu .789w|t=k/UDpg@Ju!ƹ\y-`Y[|PJ©%qU 8b>QJ*m;lS!]16KvU2G͇OP}G{r6 `o<1)`gd={#mj>wfns57r8|Z:i;rhji_OnhtY˴&c`1$1~O$ց}2 Z]#kDMm+qgug#ۓy\J[q?:Nz ɯ#Xu:߆A |ףisb0:hrm(k<- m'<3W, 'ϴ]ZbhEbЎUL8AzE1*@RG#AXNoQk, 58~hEv,Kn/c4xy߫g-r],uf b>*arDfi\RFH9"94ш;mKq"%D!Łggh$)*"hjz"3 QlM`S>'#WnMK2It/^Ϻ ӲRMryie,i42k,`S \{&&iehTA`69s6gɺٶo'2 `*R%wMUN4'mF3+hAd"Kƒh܏4)8Aƾ]ut`cɽAy-Uf#hO#ʐ&- DBK FSx9!+ *A J4#("eIT1idV +SviD4%ţXŨ/%`Hg#,L!KrAh81~= b{|qq_d{>jd}qqӒXrκ0/Vmt&dTk< sx7Ws헏s;037nq, H]a`t `UT9 @v@y0LFLIG_UPGo-qN|8:t"鸾ʛk?}Ӽ`OqYGǧ70*ruYZ_kտO~>x 6h1[o2s3KΑ`8jR2|o18xյ5)Xkl6Vchfy| q0`ż@W7u.{{:gikwֆl\Ze~KHXpsWU S'+.9"PNB 2vAx߫{\8o4 E<0Jh8ޝU?+nտﶊ{*a>bo0}9@E~xߏoO~>})ޝ~-yD/74AhwO$g{4NxUVP߼j7Wl׳ߢ^erC]omY/CB4=B4%uM?y$P>Z?d# UF` LϮki 3UT,6Ft4m$6 X$鱝 sj~HBx0&v=sC`PHO`n"Y+,os9K>s A^R9iLѡܶ&v"]Dr;!>';/ RUXUv&lG  [OHFf<ǡ&m\*q+HM -C"}Y(ʥ$,8؄q%@78F! 
xp]_LvS N(}99d-4dYj/о^ty+Fu6~!ĤߺXg>d2^=T;cm<߶31LLLyq`ۚ*{t6<\7~^yEu꺚ˬW6z7.Ɇ,| eK8g[7ަUCl6k/:zA=SؚI*moo96xg4MojݾN7/o.|{޼=u͟qm<(Hn W}K'e4ݟmMCG<*]kux#T;046N}8vڕI}Z$8!F =*1`A商A񃋲߇VZ~?,CE?1h3LVy`w[< q.%ÔF3ڤT]cPŲ*M,osd?FQoϚbL嗳shEbC/ X /uNpٵeU" 4%4 /VD#<[E1AbgZ{9_!N<~,_'׈F?jEvO!A"p5 xW 53]OTRyᅵ*5#9IJ#[5٨}NC }BggVGiL>LP;_ph|x@z+>P''CeZPuII TxƓi kg@ y24 )I:`T4^,I|/3֪u'u$׿.^͸UL2g/wwnf7.M]4?W)=HZS Vgm2{g*VY:zUW<=eER2&DDu?h#Js(IMR#:c9i˫U/NMC^{:`y[RUu49Fyut ͬˬ YG0޺/!4otY;˗]}FFS uڏ^(Zɨ@zS PR6b֔`u}-bpR\Wt l-K6@` (yzڣ׎a%J3s6IoqOގ^~XPՓw1}MdC[M,?aTä&nf 1_}!J1r_mfQ;H{E[L,HTt\itsJ;V0D3 |!޺jѓlw-!9ZO;de{ř[mVdEby)3?ty A`JYHV8$ETⵝ|PU)mbߏ$>{>3mٸCva^(1]y!Hu5HL>w`ER@QdhecA_6hk!= >?*Oo>KshgSkح6œ$4U+\^R_#' Yی&3!f JzQ+otn& LD1iOu lRم\a@8)L Ql7❙8ڛ%ldrZ5QB5#qڬ+[3u w(CHnrH\2(eT! Jde7$iUtMt)<3희%{ă?;6%*&%M'QTq. 9@ZUPxe R! d7Lu-6gcn:Z+pE\k)(&O*ɁjuF' qy]UJPL*K!$] ˦,: %5T1`@' s&mU5Mgsbj,첚f9Nq2: UԋeԝJ (tT$r%W7:? mjZzll|Ա-RJ7kS&%Jբ'+imlXD%PTqCR%*" g5Gw$cM_|wt6Cݱw׸ z5 kgs՛)M{R5ץkSPZ"QP C %8t¿/AF22biAj_lSv䙽6/c+뾴l$Sœ9oxi6@ V%liIGcIY`.C^Y <i)r䛴IX&%:vyvT㨺BUz3g{ٹ'tJ\"YZsgڭf]v2 tO_$^fP m1bs98$dQ44Ef٠qJOCH.$IɵdmƺcXj.ƞ@fo eZQ~׊Yj=Q/}~G|^]?3nn~orsj}j_+5YemZٔ=G2zN=((ՒR7FW^e$2ǓʬiP?phVTOywTWO*|+jǓ/l yڤjw3יg%]PݬTs90Ek-k:[?k,YdL}7'&/̫H`쯤SoΖ>--Vp>vbhצvxsvurzŃZ]3W |L`Qa,ߌx60/M?1.OF_oG@p"= ?&tTᣘ*R5eijDJWRkQJ5וj i y.:+:KuF0A :!&ѳ&kDf "+9")46T"o=À9$g@Jr"a]ML_qFjl3蔊R'|чXʂ$ =uIQq&Gr:k!:͈"ꠢr*Z4f:Pyˎʮ);Hg$b8UeZIrWWN0t@pڃI*ck%@Z)W_ \ᆏhWO2 0\=geӬ/$++|\W>zPZ \U8bsUv;U^!\Ii+3W鿏nVͧr3)x;~MhtvN#A̅ 5gFFwg[%k tۉZi_zuŏOyr4ZIOxwZ+!%$2\}I95idlr*EZ\CY\\ZI4,.pqQWi|ZR;OR^!\!"Cid3tW\<vU5pV*!zpEȁ; b5\Usj-UrecWA8 bU5WCj}j5•!a: I;as8j->VauJ^>vʭKG~2ӒUu;]t@}鼹ۊ޶YlJ9W66ilqaQ1(xkcnynȬ`:r/F۹'yw/t8by{etm@1KGe Ri|DrT#룴^dr֘H׈;͗$[H(o‹2Ѩ]+W3emqC7܀]Rcz6#bb2 zO5d*~!AsA7A5-BkPH"zQMl(Jˠ|AT]2x^SJDTl&GQedqtcZAv*PE2Y9IXl3sgWS4U _Z2䑼J!%!JOIgK%"RR#tR 'x ̴.] 2xJ09G<& _4 ]P3W,<"#A+s<켵6x_J`<_ͣ/+zvSN&h4iB*"^gJs9UyK ZWܤx)">#@j9CW4mUi6eÇ ʆO|b-O/.k;iWųsz`dEijЋd# !{h1KI\}sև[eVf:\٣HL͘1AVe 1Hn("Y{&|YJ! 
+\H)g{HW!O6iCxA2`ELN/I"}_u7/MRRSCbիz;VJHEFҰ)1rvS1әjuY;s<ݮ+%eKaz7Pz}Joo>[TTB=Cma}4Rp @DԦKPlAZaL(q#-rkn7?w[{g ۮ}Kwr ȽzضhZ aE4-`|s- tA]+qрc`bҙVgpb#Vj]J뺻^̔^e$Up;oB!zĄ3cĜE NVFZZww+P1Y?־sYX_˿ORE yXԻjVU=NV{&UcFߔ7085+׊>& FL_4!b!DeEh-D%K>x$F8H 4kG0#NSVcK,-W_2<`Ha;/=W_r? (X24RvNHX,Dk.:;(>ypJDX,-,x0l\~ZWyl!g+:%n-4a1^od).t"Ѭ]#fA똥>vjtk[6bM;86 Ye2MQlyR#OƏő#BqMY'Ucc{%aȳ 3YDHh x*2F-i]iklWSݚײ֣s8J*R94B p@'H!PP`QFTIMY˘;ؾmF")9ȔaQ* J3) M#AyQQw_~,LME2t g3 `;SrLMAHTp!%cAϲ/Γ1PF`0/8r %28HnA$Ph4bD ef.c;)4J>j Y#lY*$WH#9ϕ 0\ o@#{J5*[qᏧi_ҒB>v*]xk]|kZ-cBQEa>iMÜvOK3.BVårq@e";}iuʀ0QDF8; @2@SDNB J1u'&oN)Iz6W D)( p'16Xo`z/ iY^-ohc`UM!Fk{;ꯐC`](PLJҍЉۧ 8:!JJ38'.UgU;edZ_ oŃfw]bRe%~\Y+b G_f"ll-+nW7TV$8G CjeuìnO> G|LNѰ3_|9foЛn.˗m'Y7j\@\'~sHi4qY:IBOpgT7uʈZtZ=CEu~}ae.w8Evd~^J t M<?Jh45hzyU]T.U m{qƗ$yݏާ?p1Q?9x/R^ \Gcd>;C/ISCx$=YW ǸR79q=nrmY! |!G굃%Jߧ0mUU>Z%a]w "MКŒ 3D,JUk6Nl1S%+1 M=NXQ!@-wR5c8Wtrۛ[lmJK/iN'۷6} 5;ʰ9C{𘛝ב.}9EuuySn>$ͬޱﳋiA%Vf"B3%8xJ KM۠6aAh|@'=(vBB3)aZAK<{㨔x L Q:68g9 =՛R{J,A:?}6BMZ5]Wۖxr<[&˄# [:);?Z(0Q2!|g: Rw`űmw^~|'؍l u<]U}\Ϳ&$ Yw2u.@1$3&K!3e4fy5E->4r4쭐/&6L`&285Yf YJbgSq!Fkt>ퟝ{Q:ղY:K/&A{*{Ւ7Zq =^(IgCRvIO4R! kR_3KLp+ͩ(j9PF°'H %Rs6]vZc/&en_WLj ǧ;e`TfwBs\a.|Ivҩݒ5IC"kϮWvIg{|[\fc0ۻN^)`a LE*uYz{|G6;B{vjB Zfnݿm{eݝW=z^i}?|͟nN[y=y?X/*2Uˁk8z#bkΧ?o_MN_4m%s,2W2X'@gt*樷DƨyMM28լvܖJU1ߜGgk9tn `{1!2 GZ awcÝP)ct&hd$= 8y+"^q+Z A neTG&gc0 /Z[1ɗO+'ZCZ#$?" 
ֻ*Wz&.K(mbP=EॕybQ(hbr!8p56c&622L)hC{Ę)<dz[VI *jrG[׆1xD4M TSԉHAٗIC {~4OSAm<0 [?L`<]WV LR?1Yb 1yML ͞y]{4kXΉ&,%䡖02YQCٌTGjxZgO) k4CQ@.hq+QdF y5TW߁9X 9c2Z<ŖC~flŘ:[f^ځ.^dU':YC£U/8-BFyRDžJӓ3ǍHb?%8/ m$rFb *-"}$߯zr%ڡ5,QLtUOl:6Ep` e\CLF%$Jttl Af^P[FKm)I^X&&;DIXW<u+$-2$iEM.$`RЃGgJKc8<w޾[jkscT.cvԛjvZo;07–N>8U깳&1~P4mу'9 Ni8iW:, !Im;Џ|Ao]D1&$9??hvy&Wɚ?a(3wJeɸMӿFp躔ٶ-89 AziKY(*T DTJ } 9SpSj#ɑ51[7/xȡC' a@9>AAMn Vn ;mtIu$$bù̐3,2gu t1ӻԆ0vx4BJec\<;GNJf 6Cnף v6؈d-17XKuv!WN)28)m0lx3q>WU LF|&;k<|CP=hkv嵟]_Mݳ=_zYϗ_N mE۠K6Y.S=XVV6SamRRPMK h,wplĂG;{aK섖hkqq>9t ey\TY 3(9KWoNE* Ɋ%Am&m*l߲1*T(gK!tc;k&Ζvo3[M?A VdBk<"DG]u6%iM@O$+ul-AZ_8c42(=d$ cX|!أ&(6O߀6FW8iV;]5IG5y*If] Ӧ*TuIdhWI> R8Ǡv4Ӊv0fA4fv+BDœBL4vGEƱ%Wg돎)eDX<ЗC1f*Ʒ,)f_;ѢTwCBDKR!JKp 'l |֣y33ٻCfu`c1 z}Y'_,Xo>qs\w$jʲ a,6ybr}| {/^fF(.LP<`A .YQc=J3Xr{ryze:Qr"' Խ)>e,JQɲ#{.:#iLLƶ<$c_uLRմp6xss.IN|YmE yS,Dl!_ؑV8EDEZ˜"8`TʲN~6ycTAXMkԗܙUr]s&ņ^N?_̃z˓Jg¯O~;._gߞ]__ĵ v?i +r3~R)yc%CY?PY? gbwCg7usfP ũ'$>z-ϖujvcnte?yO?]vT8~u6]`9Pˣwar|7~2kp~Ia؛w+ '닕 2._LSCd I9dEgbg̫ K7tut!2jF{`[oYvJhrrD{?Y{+IMSF'qZa MCMɳsAڔEF]Y5Q@)P70K((!L((4욉3= ,}J`=W||qmiU|c(ڭ89sV;:w)bVv|-M-Żd)FUZY]$R>9ތN>N>ԣ溔]/w!ja=<10${l۽E/CZڼ;?Γ8zȓZ7+>NS'ۿnvtC6~_O;zoCΥJۯ+ZK34z˝zp*]'rHi_h3/^l:ШI G!u(AT NNJQTiŌHPz@8zT$58Rg̊<%&$1z6D$t4g!jCX+}R)4Jȣg&#fY0$ƻHкh8[#wW4m6]eC)KQa>4GRS(JS%m5jj &p2tBȜJT:H@Erpcܒ~4Ͱae6JjҞ]tFҡ5&$=HȚX|<*4kռU!x/#=qil3ZiWމg%j!]gW<ϻ8=r]^o}0NTW;~,FhcIX3~\}2*fzS\v+?.ۣ6wzu*ِah3|=Y'm"eU+r_F)*oAUS>c l^պʄؽdIhp1zr:[m"#-)ϓ@T =96KۜIiL#c;_W+ Mc, +]]Wx`A/b/~BgϾ|/IHRr:e%Mɔ9aRe/ٛ$wT^綝%4j:UM6"dKplB.+كƈL#vMZ9V16#j생vD:h]IRBYr uJԩ$і-oÞ[js{‡T#v&m&kAYcWDƈ#"pu)E@SH") ŨTd@sq#Z8RW+gA#fvMt9%,DbflQ~:Pna\Cu6Ӓ]qqqra-: "Icd 9s~QscY47dpͫ9s^Eё %qݫ_=̓D{.Uj! 
Ó lr;b/RBѐ呯߾܊qA&An1;jM]10^vYB @l0 {`Z[%o6N;h*!F{6RHLĎ+k?8B*!>3xɩnnd5V+5iCa|*)ii4\ sr;e䛵 hMe`"̩ѧ<&?ͳj~j;gEN<ĿyѢgo~cr(PIh2 UZeYR3 _afAc5#+X \ XJWUu +9ef>xWUZM`LeHGW,0に*WUZ'W,c05ZB\U@3VpU%w,pUsF$G "٫:gY]loU-`/IeE}r^_˵)_ͳ鷫_-*;HvT뤠lu ey  ow*XVhW,E͟_L_ZUȵLֶ< &lTAi] h()kgGE^a9#E& ##bK+YLƿfV/axoOz>It9`h=w[>d#~IY~z;5).>J٩QQtN VWV>o-ݤe(of7'[~>jP K9fL@\N;͗e{7?|].2UqVY`Th*>JkJct"u~{Yئ@pp> |F vRo{h‡mI^vk@тlD :r,G(RE'4(,R9T8"%%Qwe$^yD^X~1lv<0"/hԐlYde;&NO;kO`f*#J-1ƤU 8En j4T6ŋ8@M,q=;? z]2e2c:c2>ѻUMH_J&(FU j'qxz[2 kV#O<쁶Mu;덋,i2tAXR)mGDZLoi]K'Nj@>j|#>۴C:ұ@/ڔY`+1W^9r|ѕD'z6eDHoD5׹j|m;vnUII-KNY&IpϥU2;>R1Lr4zc͐UӐSCW3&A-𺻁 i!ͶK;KPZ Q P c; rҾbex 5ޒbJ8mE)j%u%2Śvisxz7JwOp5!ELIAm0$ߴ@,bLRBqaP/8{g|!dC&ifͯQpAo_J$b-R i<605Lܘjv>JxRc\؝t.|g}y wzQicJ7mg:UJez;GE[tKXs|2WK6ҟK֙"hhtCzqΖL|UѧNH#l?? {#b>?2Gwk޻>3f.4 V}ro= =<-iylb}m{l~f̴SM9G^+vLksw_!WhUe⾱LK0aTpp$NJWV?.x 0! tWZh",3T hgL ^1{ي1 <`.#2 f!! EFjeVL'9œu@]Sָ WQmIw|>|ָm)ryV YlCJT9{!BP*$'-6׺V:f4*ƻAo`6?Pkt }@vߎ;7֚}.zhQIZ{')>Epsz:O˞W_z/?i ńO|H g  $JKMyt:=`9;it"yiL6&[7?;bvSRd">x$}]m?D:P6'}QB]]z%lۅYvsT*'Kj MdI%L^TjgEF9:1h {!Cf>RsBtAGRzlc޿JP;ʪJU¼elϗ7vKT6_6'VoF㘱9ȴ)3kY 9Fb=::gX]|t|6ϓhwG:$!')'ʍ9%}SNy \p]rwtA=hC23S!)V%i) j/^嚌/b~sQ.KץO3&-f](fo?-,~SN/zMoO{{/{[$k")- kf.9;te)o"}/5'w];O{P wyys*SޝR U=9.oKoAGv= ?z9 8IdJX m]$NiY&[plOy^]9Gutږ'o),wZ<~᫛-T[$cX Pkc)73/5@:Zޓ[3VMo5kd>ب,dA;YmKYu;gPCw'wVp'pItGޙ渢~2qpy0Z6P˳Uu'i @q}Q/7*^ލ0]%w<:,p47+Q&'b. 
3B4҆zpa i u&_NB}A/1R}zSPyےMual|,/ SD 4(hR‚.o22^x y:Fo3T N@sP-j-Қ8n`#F]'sqHǘ2DAc:2\Hc2GUd 5*jDFvPQ&-Ghސs}OAiM yXQjeUejOMW"Z'O(:q4T-buw8XWOCNC!_f)c4yj6ha(!զ< [)AVyKqVQN!$H@n2s,q isST Fy*aak#X([BcĬ*|7$OVqaL"d{v8_/-)!iȨL"Cd+t i@:Z$H4ǔkh[avQX31;ݹԖ=j EDq 7FSrrIhdeZ0 OW]<@UX`mGuH;%j笫(ZɬYB $r2'KY( վtN- ہTu.Rw(Rw"R %AKO8K{(PgTjF+*htN z4o$>z.G8-4)d}1͹ӗ3#IRDA[Qmyi;q$'C!JO]ԭ^|$Pk2 C6/ǚ.lvKogٮIvS50kk\d_^)+P([<*"S'S>~TCVo礐$b-Z$*&ADc|H[ڔϐ M2$(m7W[i_{ej/ݟ|l멨fW٩KTZ Wc.Obyp2ZՋMs'n-oƄTlꢾ<𿕋̻|qt})b%̥}0Jg5H1~Ϫ8V];QnmIƖeS3dc3C#lqO؃,> o=msp9rVK6[WN{?.5$7,qez0{rDw rE$ u>1Wn\dLSXbF!MM Fzig\J{%ڬ dl28rWkm"[+%Ht9M*KtZYTI|קҫ:a;\l9YF;5o)'Z4 ȿJa+-~Rr IJ<eSd1H@FiM;IvqU$(n(ٞB- )>)\2i@CġڸdX 6@ŷiR>Yůଢ଼RƅD WKó^!o.jH`Ԗ/z됎Mb~]{¶V^Π܁BjvV} GY ;--~ b8-+#U2^ K\c%IMpws9|aOVo~bgi[7M+Aš$]\`{@vKIZ Et:!uIq#壎9/JК8ө(ԥzrOLJx9őy5{83|L?k-Ãӗ`O=JG"ΨO# -|="(w _ s"*lOeNmcSL30uݍ0+ 1-b>&p8.o@Z9ee)޼+?^+DP@gZ8R?7 @ @#!25?ul#ڊr-jd}|ۺ8C.y6T\Q14y3BU(aQlލ\ {ob45veٷX?춰@/!|Tݵ6aӻ4'-X^VksU(D[ɶgZ_i)Bqɲ% E"yB}nu\pCZZF*R)<1ΔɜG9g]r`=jkوV4=>XW h䮪:"$74OZp/G7@S7I/8/v럇9k|q+nm3|Oӆ(:ջ5uIvY`xTu 0-cPݭJ(mP$+o46ɞݮgABthF= p[Bv%I$5< t.Ϣ0!DiCY|,VekY\M-򵫗-Džm롍]P켾QYwTϗ/X\:~6Q 4T[TbjYd@qHr+%DRM ɁFEC2RI.R3HҜsUz-*U୉oY߀OYj *][z',ݦk&6e,KPR֛ȷ ,wYM1y@Ǔ@- pU pP<9wKN>wvq+x擛 l0Fw:g~wϓJs?OJ]gdp,RqBH7Y'd;x@oy]6mtӖk՝7K4u+tXanP>ܶEř+g޲Z-j_3OF|y8H{^{wǪa'+QW6 @~򧧆UUP\AžUVUR^!\1f ]{.էbY r8]c\e]~0&#C/◟O4`o"}{dIV8&!!zᄛ=,{(.j_`:Kia:K6S0z`Z>JXK7p{c@d*KtW$L=+S7pe|_*Kˡp\kի+EĮz,!W(-#p++M@}Z ŕtoUUR^#\_[z_TkmFE/ ܶfq|f7H&w30Hi %-g2brdz,3c[nVx_[7%k] ^.˰9ds rF d Y$[]kӍ5~n)eD״w~3՛MVs4җ-\,U$xaЦdTB9ZeQX]P# = V%d|WlT^˛sgԞM..{{.~~s;Ou[]>3tv) s)V v4NZ_vl}}Ϗ0ͽ-*/3d ~LOzN؂s5eӆjV@1hTqΰG3Λ/t;7ΛlC+ QTـ#9p1ʃ%;ݻąh fZƪy4Dz%u w)aP\>Vˌg;;J1ȝ7CJww B7zzRQPDIcJ9b X\fր)sD>K:#tN]GMz)%!.&AϚ0yyL*X!XQQeN9*a2S7UXU6*yze'#b+ܣO{Xa#KRavݵ_l"Z5*zO @ 9c^8QĚ#фW8`OL P)xM4VlF!":e4!]d-9 7sjGjk,XTmqUxI˃ݾLڏm\m/'4~:NgI:Ip$iVn`"6yK+L>Ȭ1Q0B)GaS Fm&nGFf\R2j2gwޠCz}PkEU[.Ixs+RwUk(P,DSί.c$^ (x߂q*{uǐF'*d-HX@G5 ă "ĢF\ ϙ'ddK Qj ֔ <)u2v0b@ڤ9AO9 Odd1 i'@'r҆,d&+F:.='gR;<&5~:!Ե5BDaS>b}v"t7;谬ۯєoy1 hc\"D:s1Jd+-1$K 
vwp˂"?Y\5bCB{gV|`2h|@εxM)`z芹Fb=xbEjwƲ1dS7ur(ksmi*PMgŠE7'ilg >e7 [dkq哯W<;L >yU>.4dzIZ45r6d0t5RPr6L3*K7Z:VfVu}0 !Tt;3{VfoSxsc 1P7Wӂ}K~/u^ cٰ;RB0J:ER1iXuU)h ~h*e!`dK%LHdDg*/V3IrCZu'=2mw7۬2wgn{BwW91SFsyr tрsےdp/7gFBFHY4L *Eu3qn'<$rHT *GV[Q9=x+ r>_ͱW__pb#5sF? 1K+_ejSEs1 ,!BKPI8 E҂7EQR`㸍Mɉ8'ʼnAr;&9,ר zes1Y8%3¬2f'>uЃxu:?[WW4!-/`d1Vʈxi@a52!, J3 ։dQ&svd]wű99qQkVLAX \y)آxH]J.h;3*z6VWͥO=.R"Xr 0 .sCx ׄTF(4ZpڑsR'L$R^9$Sd*Y_ ˔ 1J٣=B$ PgDΞL[ _ގ63-eSȴ̏5J1|V;L h(x[ԨƥatgH,V394. f;6Gj2dha3Ͱ֖Yd#Dʔ ]q;:FJ.ot4u0䘣 E@ʙK:aŏ&>fb>7/#OOC7?5>KG #kjz]zv8!ebeE(>$jp};7V:בk eƉRuF;ʾݺ٣ej~~\?d9z[V7jϽoܴږҰe,jhG%h͗.l.AI܄R}3ۤ]J+sue:Oatp=RoҎUobܪ'Q\B77>p?{YXvA<ل= ?}4nFM[ij?]jrOnImٴϱ ! ޗvGA cw}ߔ#m,8Tes$AH3b6ΉMrSe|a 6 ,'CpXO0@mn5q!(zg"w{A&G:'Xjuy`3u륌}1@̗{u)/ks/9S3X}]io#ɑ+>ٻS.f`~접GDH"5<ԣ67:HQdRkddUFċn2RHp'zxÀJ=蕖Ir'$ aT)icԠsbT\"XCQxbr rm@ 8Z(OJg0,&NF"ΏhoI1g 030АK"2:@Kt$IV' |FC?/%t!:.utrf̛y4țeܘG:nnL45- ki Ynyϣ_նlCM'ӯJѭr˯:$)8P_L%s0!L'2pUZ1^V4jlfa:ʔH! e4NY ͢Q=%guL[Dy[4xM6boX}K7m"iW>9L`7IoO= GwG \n=~~'|C/p1\&{4vg|~N+mWt7)͜L_xh k=-q޻&<:)~폀΃vC6Rnl-?5yXNMMmz;x|U>g2h:^P;~{QAS 'iW6~xӠqi,;w,ܾll:Qs 1̇[ѿ,sxrK9=_JdكN I*LĀsPA˲ߥLZs>_3(EIp¢] aG (Za3áKZC0eTE[vJoy~ӏ_0Oq>^κv7QVvﯺń[#d^8ALyiDQXn@KgI[X2g<|4=,XǢIVݱ>FBGdc9F<d:8}<=d8_r~qºPb=$Rj$p%W"J1g&a%7&d9뵾`RZYoC:)(2zL'ґ?.`wjs! 
F<+p~x=q/Ayƕʔ” ^4hc*jJ$Z'*!z3t3 A-\||yvr&E|=>8g d6VL[('fuSF~)PÈUNw@:|O]_g˦C{ioWԝ_:(9>v@ߑi2L'7 /N.Lf9lU#zۑ7#x_-Gn%nDNK- VL !ON+'wRt]Rl8Qϸ(bNY+* Cbll_C vxpvqAܩty}LsҤf^bYϝ\6)Zց^reݻ춴7fF%ʗECbJX*mTbJw0V35n]ɤ5$ye;<Ls^<-+쐕U[-e]d|by)lW]vbиdtP^WQ>}/ @}g|9"jQ<:'Ί6P("  FSN\*N+OG&렌BPQ9J%leL3R)ͩKkeŋ86`Vd`)0+,#qܬRK'aU.?\bx]p/(;EGqJItsks͓4VhDȴጉ5lWl^shW"ٱN#MAni]rG`rÿ<C7A1 @mҮ,S#q8SP@О)MXBg -$Ru[^Xϊz+P.mzDlϠK'Ω>Xo\I)o#+"*ųث@IZk7/<]zF5q3DDJ|9I: ~)$#V1f}I =S5"5H@ɒ 5^H@LKF3L4(b84M9iYiE;A&n:T $p:e+T$Hm^2 ׶r+"(|U>/zz(Wll|u)k7"HNR朗p6d0H> WrcU_^v!j*[\Oc e6& b?.]R%$&/lPw^p%g'n6߭'l\(F &JSy'T%M&V`Nb)YRL=v./,F> Dc|6E@d5auhD<(aG2\ $<$}ezH뮲[Hݖ\FaH(I+% eI'3H$09T QZ,6'^GT+wdB}BA\AWR/*i2W6TRRk'ATYgL%*Fg,7·_2[ENA#+'C #QD,:֌RE7I,XEb1:\P; c( :)xk"&r B2*Z$y #bl~$qX/LI—cgt=:*K jo(1N@BΪv?HD!DlryMkVW'Z*>Y&A. \trDR` rdoh o%^ݷS)ӄ6yhF3j̛k;]Yw󿣳hNBXwUz%S?f^d-&ҺURKnZJ{HRA**Ypj>?Wi&W x,lkiɐo8;jvswGnqQzzz}ȥM\f~鏻I~nR%^Ԓ)Qֳ]g*Go0u] LGE< C6 Vm\ ^S΃ϋ A[]MobEuWeg6}?0Of`q0h ķE͵7k4 Q/'+˓A닿?p*Gm&P(WEdb[*+JI.PťHGz H-? ~G"L$ 4(`Ʃd4D{&PA1)}&. ZS@# y76C3%($Ք)ޥhIKa>š~uHbɡ+F1K j<0uCqC"e۷,ڇҾ2wDeAԟ_+dujҮHIaŖʶeeSC~CpF9K>b ?d̿s#ǤsΕw)CHRZ-a=-b+Wcj)Zkn%bvGsy_ω/wߥ?W?oG!`ό;7 ;w6%sj#{FܹsW,nwGAz?JSiٹдY Ui;R6o4}:46zWDW اЕa-t*T)ҕ8W`sYᒬmXR]]y'ҊJ?sgUO+EOWRFW'HWŔdEt^Q\Z])Zx3(C*^]w5])\RJQm*VDW&Rn5K/ E3 }|e89͎_p}Y: -%]ǡvYtcCOB΅ѕq5tpj hJQ: ҕu^Xٝ4ʧ3"VMpjHh٘ 4J__] ֯nQY[@(}ۣIgTz˿|_s. 
o Ƽ?`prwr`.yς~!βa/n?+no?8zqO7%}'>U~ ໬=rsOg9?"?Djja|޿ _=ai95#O/#e8 e 41;#q=7POw?>3q Zv#2;lşԳ8bYef ;S:z>fc*Ơ|!?.~q꺷;~ݗ@]].޴1o{%X]6%eq<_$& >Q[ ɖ\p3-@B9Aob%R\ũ&2ǥ8a$&J-$K`̷l0B!*Ɯb'rdld?'{j%R!J`$bd1 7Qrr9IX?S ˕o ri ڊu> 1D&pC: %%=pN1#%$R!sȥ/Ah^PsH sJcEĕ1Zr6>'@BFZtoC.k Fc ߚql6dR)TCXjGQUr#QǘT[P # A 04"$~$i^9dC̕l)SAd*Y|,Sk|B吻9 *`o[ B9JjhR4{HT5Tc'@ҬRsaM$)w9$5njG[Ghsm-*edT&XK} R@i%#CR8'ݷn%Вi< Dܬ}Z]dBT1l0uÅ*TnYB Yd|n` &`֙%CâXtGlϝ`Ghݥ әP4BZ%76 `9r(-!أYԡC[BZ u9xi@yקGɸ0kD(Q\z,d0)/`ǖڨa\LE!%l S˨.( Xv4giF] CBp;|TPyǒ~j!!1\8D$,s͐(շ6l=x_ZM/H<-d3Z/bA^f NhNOcEUN=D#NH$9`b=ra'/wz˚Bwj^^ [EգQ L0B$hX c{T^XTtU2GH}1;Zm*d`V`yG a/,,tA^xϰ ) >@(&=1:2؇8˽#aӥiyy@{:K`r!Y#e"ԭ:hw3xhՅY,TG׷ßh{Bt a-%@rej6FcCZD +) XV̪%"VF/ $-QXyBgPEV90Lt!eO`3]ڥ `[= v6,QKǴj\k<Z 2 P/3ft4$iVFQ5x7% <%:`O~@r8i9n?T3FB?ܬE.WsTP=`˰T`Hd[SAP << X뛷ec_ >.`E_h"u)A -$\FQQg]䍡"j #g(U[T1@:.>JHd`8k)tQC`P'I%n탗`@ \4 j紹W )TQWUjKG``.VFNJYY $~m*ET!2.H} R - Fi!k]`!ީFq1P aKp!q=٬ŭ&yYz gKE]sKcA7K. 9=}j~oᩗ5TuK lM~^xkٸ )v$bA\h+dh[}Ô,|Y1(Ofs(d>? }.l|!hce1g3,i>mwb[8(>hۇn-9uy M[YxH'Z)#sP/>P;* aKJ Xw@Zf;Hoƞ@ǨKFJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%*,&v]RYdw>puw h+)R o<)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@ǫL.)WAG p5Z}J 0I tJ o@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)W a 7CJ sg@wF h!U_Zmśѫ1.5V7O@ci֮/d1A^-lKpLwGpp *w%@i AIv+̕ ]!\BW֛C+@)81ҕR !Btb+thֈEtuipYL`pGG1iލ'0c$%.ɋ.q?xcfY1%(>U|z6.7jl2ʫ6aNWJQi_Kͻ=' _Do%pz'8"75?ó3\rz~G_=#T X_RJY*xSw uB){Q~ݞ4NjE,jUFDPwd:| ]DU]|YCb%*F_y9] J\iO p;٩.vzZ$OZRHJU,uijaTt. 
#؄C[ .Uɍ}jشW/ӯ nOF7GJ2*h9`W/PL'5KOz@(7w߾ݐ'Jۡyw]Z@pjA3Ct%at]+D̡[DWCW/JWpTjBw׶UN]!]Fw07ݡ+XWJ7:]!-U/Еq.-޸β4kNWRѲc++]!\ߙd*q <47+eֺ;Sת5_{;]!JEtut2KZwL2h/dopJ7zc~C+Pkxtj{Е m ƵW+ldw ~DWC+Di51ҕTrkVRB3nzRpٹL9;DػдdqF~(%>BVFt͡+th4Jsn]`'Yg *Bk.3DWGHWked B3sv?Pc++% ]!\ՙdZ~t(v$#:Bٙ+u+:3Վh]!JJжS v;ծ+t兑:]!JOCW]/{Lx~Cm Bk{R]=J]ݶFu]+D4DWHW{h٣1]f+Z]Čmi^A{eL٫Ot32gF@CZt)FNt&u&FRz(5GbK#9T(/h3Ъ'8̈ =ڭ #1&;AM8KزU@ l{&'Ǘ?zr5yiȤ4^ۖ)S Lkgn/*u'1hOW^5kkgc(0i%MNnjei1Zi Bny>Ti9>^F?N]u 0;⟗E^>y |WwC>O!\ekCkϼh>uIq:,'~߸ RJ1SWU06)xwy'uU2JB*)eRZc.74\|ޟij\X wv0F&8Hnҵʵ~e^*aj.A%f"s!_v>d8[G9B,apsbyhz.!/_;TܺaXd~lAOF+ֽmaxyiŠh?ϗ >ov.ezB; I{rraڇҟaӬ9hbCǽٰ +;ԃ>ĸΆrP%/y$Psxme6 27$R:UY4WrU&{΍,Up\uyU(\V Tud=\WY߁O'y=7:_y2͟=__ýqz\ګ ,>;+DkN7[oc7>~|h&9{o x6D~FmZL"4A6bp\L&4%$(/zk@`1X&aRVE ?P|~6b*هyd1 g<ۙl*;,n-g=/8Zشt*u]T )$t]y@Cnfq′8qOYMUbj͝:iM^y˒l!l! bVe3C>U+ĿQj!g. [d Rg€WA!&k6=Pt++f:I'߮"dyL^e~^;Sh򮿣[.0謋I3JE2k4%Չ a! EiL:EI*]7C1Tep`|2lCLރC.jMjY;aUu\U;trn#rGFjHk ܾ<트U;"g71i;3*ltZi _5!{z#|v&hc%5;"?ڿtGRdNjΑ <[]D`nǨspu,%:HHMqX}X PdmU˜]vR4b+%!:@ʚKD.T CBZ7x8;f5Vn\B|t@w;'3fW(*>V3(^Eq }.39xx3 <3՝?^T(-3hŁy&C uRLX5Rr{a6˧7ȗsu+"cʛJ1M܆re[<7j2nެ'!",44\tկey  YP ᔵ^EPOfc9k [7lX8qG݄Q2RW8?)34','OxZ)O4|Id<٭ɑb<@{ Ʒ~<~W~ʚ]Ոv+6)' ?ħdb_Ujw*m mBa\PSo_ӏYwO'g @0axg45qYD꟟R M5`&"NI?}?퓟^½}O߾ ~CUoN~ |Vߞz󪽸U0__rcjߠ^;zqi˺x q ydwtмxi{hPf&i~1?K#TYy$ #ҌW"6Ӹ9 ˁbw!K鲷'79Ex"HzaE.e=: !|mN 㹓Co"Bi?][oKr+zM0V_o$lI^rXT$4)|_!)&5ƀd쩞.0y$V{ox6M%g:Uti&Ėe |1t| ml/9y(݁rg(σEPjyɀ9s{7:J< Nʅ&*|.%E̊GG yty{ꯇ@ " dr^b.M@/GXe2#\ K4/v@]UnCef '/)$l܌H T-W} 2P6j1}arc"W[OT 0XdlC+;myw;w+TMj/宨fQRa ҋ/u]YFX+@^4,o2YC͙Nc|aGy+|V&!NIn"()fCI2k Ai +@*YH d]2p2P YkCF&s yN=B}wd1%]x3'μYJ잮\i\1Y}0ݛ}^ H٢7`RoF!י}UFr}tߐ Yj6{vH`{& ü;_-/OZ\?];To=ZrO`|WX/8㴆ݘG$y`@c.17u+8K+<.,ql.Jf)u\,h\A?HF~6=V w:i^dcZhߕvqYyx-ٮ;]X<*dron{m8ڗ pcmv0 _0Ԛ9}2uL{C]njW%$l Rhm&\d1$t\Aтc\+52kZ9{*ʄ \oޭ&;`+A=+؆׷kn1yپP14d7k@iߔ[6igi)1`D=\u{! 
]/ Z(Ew 9*JGV %; %Ӛ[U q@ xGoŃ~^W|.=YK˪%3Yܙxi79sʋɯon|^FnrcU˘BT]7 $ft.5x&ys:tH'AvY٭؊]v0a^DŽAyOA,ATZL3-Wʪ~ydȇ;gGh9(ޣHJiw JR+i$dXhF{:ř&hk O!Q' LΧˆsx@mHOj4*:O$LZ.}x3=G*;vMYeE4^:M4R-A΁!f%1eDa@R* [Mrp$~ޞ `2FD 2;xdɎ w*7[%r/iR/=nn5.~$n\,64f9z8''&tާNH#٬i=G3~z3ot|# |ϓB>,^/īB3X;\/=Ҙf{YnL ޶3x6 laO+yڬt ^Oٯo.e2qXm e@E0Wrj(81p&p&p&ʫ0V.66Z~=,3T*hgL ^1 dki"%XḋH ()BynrH1H8gy,tlN0'jr۔5H)s<}4q\| 6rCTI2Q;O-\R@$ZPAx jZC#ss֠ *qO6QB[wyFъM}f'');_5VI1lh ilv&#q.2p6Oݞ e3{ ꛶dLDkh&:N'0svh2ygdcRuɪؿ29ϮnJ}G?Xx#9b=n͗%!/CJзW糗YBek!gEQVG[M$I%L^Tj'VsP I1h {!Cf>sBtAG)%٠GZGG' z;ݡ ɇ"]We6EwЈㅺ)*5fO͒Xphu3aN,2m ZdVġuX SGLg8N3hyCmv1@$d$YR]IHY7!4h*u WAs ڥ8._Ys}Av0$c9s>er nU( "ci28%2;)&A;uZtp(`)j[v#4Éj("A+/j޹*d^H%PCI:eI'Y:4i*ȼ?&E{:6EXu_| xt3OO7Φ_/G<}Mec/D?/ab:YVxېoprO ] w%e8H?_ܔ Kpq|s_1=+ ׵t^|gӯSɌ#FY&-SsqxJxJ9v%XH˟O67(mxIZs5ݠn=c=9\1%힚_UЬNWS֒=ߖ`խz3 6ji :G<|As|T!h'mK??&o|?ίs: Q_9qPNK]fc}2ZOLSY3'w}bVf妲x"Qy{tt7J춥WOLҪr:ݽT_zwјxf,ndUUb65 S'X˃ i$'׊'߿^j =UT/S`f.epP\)֬fJ*Mb;{u/u -}r oK|Wby*]6,|uQ-2hZ_,?yw9Z~r?{f7tYk4^C5Bt3qآy2,vRW漂8Ly΁˓e>s벺yqa &XݼU`w;:L&JuԊfkOT0vd)pPa(R}9LQi&h,qԳ7"x1?]}G7::ud !`QtF0WC'"mD=\_NbZE}Yb$VKQcO '#i#i}8IUs)-R>Mw)ԝOrUƶ͂u: ӗt@UlSu5F}U3xnhyU}ns޻!uC?hMxߗp<{lK|.q΋۫J\EV#Tkʜzp+LT(aM7W.pw$ܙ߂f:†JKKh`CeFj :rTؒl-j8$%yr'-JĄ!2D1)X_)%qV6Q&":h8ɻ=) )E/(18-dՙ)܁凨ilx]Ad)*g'Vz@pH.3X o32ՉT0g)TeFzv&Ae*Xن$SLHXhCPeZbw%:O[7'}5t]Uɮ9mDOrڌfύy=䬹HyG>&t9dO957]R6ZXJc*0oVjbFTpIn+s{bM2@O XYl=tL*(eTdLdV錅͌]Y(;fXx۫ɳ$E3Ӱ8Ƚ/r:{OĨ$b,1hg0c,8Nt^3S5Y@7"#{b+JDl|;IX^Ԧcbw&qb(\< @}hkQsmќ}p\9'أL%mіgkG| , k . 
tQDA)@h=l(RRZi Uj3d;lZR.iAbpr[)z+Tkeq*%UNWF tAbpr.WV Uj2qe6 rZ P Fb Ǖp9QDǮȱcWeG]mVNe WbծCO1+W X3V P jw\QLO/\ ~3QTV_/rȮxQn6^NW'LTD^H_ Wj2_7`p*5kTcnۭ L`*T1FiPHJTɀ4͔+-W(WRpj}d| Vpprnt+Tyq*"|Qp'++H)B1;Tìĕ2^@Sʵ 23xFj  6\\KL)Buu2^lȥB+T SĕLᣭk^P jw\J\=\ ^=>̞la\m'WJ-'8N%ٽ+䀫]ZK}q5+W(+PK ;Pe݀Vfvy\'ͰWS7~i-8Q<';ktTVQڡ<|4?E>-1k .qAZ]6cMiutj^-P-##Kl:J˦, k#9iwzE|0 ( sO iN?^~-."DϔV{_q39ΪںkFu9&kdQv1;(K.Ko$f^'(50ʇ]v ۨ% è&V Nbmu09&k obFCfZXjmUfJ$afckz`wORYQs!0Z3 RPhXxʕ5TiZ8Ђ QP [ Qw:NWVU93 .Z^ @-?Pej\k[@9z:bpjmϙAõSĕJڒŀ`x1BմB|s22^39)'vri1+T+z+T9;I\c JJeBbp9?Jǎ]m#W{d\m#YW[gR[J u)'ưp- JS P};q \`iդ\Zc+Ti\&='EY ꂒs"BVlm98i L`4U\iPkx` l bZ,W(bpr rw\ʾhp$lp%dG/в\~[TzpOWJXjmA+Yput+l,W(s" 'R :E\.ɺJbprM1'Vv*j?E\YE.fG\YLw\J޷G\+sBh;(oaxؔ&>S 7zW7Б 1tc_ή5oĊk3w:H2kg~9vtM &"aztx:vuSP&}$d}Kë&uPF_l荒͇`(MKfera~c1 ywl %WyVeS quL.Ӵ|yU}YM JI䫛;Wœ Ce/q5Ag\яbyf?ߎwy{gr.&Fp9e7{wEUoɪeٸzfЮp`7жIOtV3LwSw?t;'[ fء#?C묱Tr3kLoau)x ,Zc{P@/kQ~XwjalւWxA񥝏}wFp azCwݺ .,}^3[qV݈ROnwwP5GOZ!~GU*58;ZJrW!fثE? Z70l__jtTVQک`>-;iSo}.N۵%7RB5 cA V#kϣ׼._g^-pZtQﲊ=`;:ꐅgVpk;ns'caeˑd8=i5#a&TխsϩF:Y|ҭtut؅lLަЕ 'OWJWgHW"yIf_R/<]u_]'6}zeK?< .Pqh ]2Ӣt]$",1ЕR"NW@yH+]]YyW'66K#SJ J75v &p `g/KDwZ'OӊWUy4M +%+R hNJWCW([^ zrBX ])ZǧNW2ΐ"y~At3fpb15JQڕΑX,0bJㅎjN9 6炎8|\[y)'q |Nb+ܗ8bNbbvOzj Xɽ1O/s_MY M; RhZ>uV,+M!M{o w`]T:])J^3#]Y V1-Ko!~H'o]9UN۸i1 w9tJQl˾C[-{R!.-S+EJWgHW)8NvAt), *+NWүcWHW"%ѕAM%]֞AEV3+'5=W&H/>8l_xp˪#cLՑ(S<)"t]s%GW[ ])HK+E+ttLҕurxMe*cOL=1ov$`YMnŨJEӴ-.ۥЕ}CB]Yoh;JSM:2ze?/4pFVH3=;l0PU9{TD%nD˽6TI Jh !;Y rQ%Jb$ZTmr" bij+E͎+PRnADh9{n :3qX 9Bq1Z䫯'l"7πHDًIoEh'C;ֽ`!lCt-2T`"?cJv44zUpxT=G9j֤>4ږkC H%5SCk$#Hnl g@x>'V0PYb"NܝϩK9equ$~Q֢绌)k,KՐR A4Y3GHSq.غ dci 4? "YbuM㜨5n8u)qJ=7g,\8"Lma=ΈfY24,REv4 |p.@0LdMf˂whBZs\[  BEIm>`Z 4As8ҀO+x0HP>[CXbӪHeq2B}k')[:+Bz#mP|z Wn< FR7f#ak[@5h[5 d{{C\GN%` )3#\-"<+@ dJM6#I61JO2'X(%%n(pJZ2I5Hూ6#L% Wѻ. 
JK\ XyfH57ki; R̷!a iNB‚HPѡ0T8އjM9%@ i_ `ɠ!\ :o¦:K0)tn7 ~֚Aނ(EzeH_uBCօ٤I !QYUDI)1 2C+=1zWCB~}9#1 p`l,@u6 b@$& Ұ:3r|p&; &O L]'A-Z,Kӌ1fPTTvND0BAt4` zntXs_o:@E6ɉ/ IWsl#eBM]@Va1 'E$`|YBE!Hd"5k+z3}*L'`=0/) gF!YeԭZHw 3DhGx[`xV% 9ա5.U` @vV/#H b!ҧZ0~Lv?ۛkH'.!OnrT*vb zBfȈd2"A]"a bVot* ^R@ | $$A~,Z5Kq[sljdž@HH 7 Pr+PS`$aV2@@KtAK<kIȎa3!eh (:[#f2FIp)a!/BR9P.ϰ[VhvvJd{@ww `=,0RJA = >d=`\߼YOpP65'V$ɷNY B8itu>]q;Rd0B1 R T/w\rss ֭>zuvlPocc.Br pyOQA5nhIA/3ts~j?y׷~u]|zw퇾wߴA\Hqz0q{__ &f,BQTw;嶛xqSqkHˋ5__?yNcs:Zӫ}#˿̧ׯ~swozsk] khpdcw븼!<|i;Cv>=_|W(㶬/SE6T?jhIҥW`Io}(.Yp/|8ݑh֋ogXp~]pu tW낫uպj]p.Z\ W낫uպj]p.Z\ W낫uպj]p.Z\ W낫uպj]p.Z\ W낫uպj]p.Z\ W낫uպ/]koǒ+vq=R{ws@]$$_AO?,p(EV z"%rBbYi6OOWM+\W/*tcYwWX֙SK[WRJ\pO\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE\! Ki:#BwEph S$:F4\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE\)Jq &;#RWJpIpu+ q# HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$کꮂ>-Ve+zխ!jf71L FJ.`d;0;0=ܑ>a[!Rժ+th?tB]!]9t.EWpBRw>xBc+o`CtMw zRf!rDq_ m`6+9cJCWT+ +]qv=!Bt.+th_:1=]!C^>ͥ;]Mwn5aѧ/G6yxPӷ-^FC0T(͛庉fҏZ 7ܮJ;0+b/j2 ݝv 2u3VaD2o{%k|3JzK/?y˙/ȹT! %Ebd ϛ 1TMrVD}!}42Y,6&W.'ocQRJy+'}L*A:NRwn@pE/nPeJ"Uk_ 7mޮ/;<,:'yk,*%YժΉ=rhb++)HSWRg޷+ ۅ >dT3-Ϙᚴ_~moEKg={3:3 nٟF A~Fý/=ն:v_A2} D=ے5sJ`+W8sXjN~RHWl:MIHgIs-a:dvKR;! 
\љkf(#%8U7, ]!\ޙ  QJOtuta."A;bk ]!Z}bDi):F2J;DWϫNμG^:]J͉t@5]!ڗ+@)!:Br!ngmٮ+DҕN.]!\<]!JAkkvC9 _r5"3@,EPAOԧ޿OmEyVpXTjtvzB@$X!6 HGNg㲂11!jxW)#55 E\-b d\|}:յ:a8^GsnY1R)3BXZ*+?1=A~>MkVl9tQ|<7 R3e;ln582,,pC9yM" .w^1L`|@,v+S`:^k^(<6oo8yzta|:PyzE8(.*z?15s&+Elq `$,GѧyqkQ :L-HtgDOCׂw Ǵ-3AGŊM*ԯ6$(Y'', ^tZU~yTt>\6U\3omrIszs1|>fӸM-_A4@n<;o6g _!zi<XYq={ 1a<ĜU"LE|eUt*x]yK8K /#pgay2%A2$¨bE0ڧda!䘒q\s-SFUDNKs"s8q\.C](!N.K54f /L<'[JRGe!̗B \"S)$xfL/K0W4יZ_X8!&~jFM+^3 C<; U'^]~98 WQSUfIF }\ 8 ό\Ź_{A0S7{, z˙ēDj>V>} ٽڐ=J['T͕r cn'C|} Bz`$|x?Gтqa,c:g'@yqSNiA(w5T}( ʵߞuݝL44q›OQtÓŶq9j|7QnXi 䕧_3RQJa<=Rm^恐{#Gx+TkTк~-4͟]mBf̶?Pm3vj2mɴIΘvW?rJf^WWdUK`.„߲׉pនl{VB_nݲ'5׵:dacQv&adZo2J|LZomfjb+nUoM &,{)'oNOx]mkU0Xب, *IRZ'%3)_Cd T+AXrޝ8,zJZNPEybEdI܃3{ [i~Z9.tW9}t>H_vl5ے5sJ`+W8sXKhgt]:n]TMw5lG7~<~Rd^|ƇDS;gUQ%H:k !_oLu _L]7׌CbmnѱP%]Tx}QHCzLuP"CKD~<V4C`s;(RLA?{=Vc"AڧE_ ͗Yv\4X7uN%]\12YgQ̵/AkbagZȑ"aH ݹ,vq/|Fd#o[eDzeJ3n5[M~M֓eFž8cs Qzy 99t`ޔ޹g1Vő6 T3P^i|HYyOkx:^h7E%c5q Z)erY,0p-wٕJvBπ<4I։jc:Tf4+'oK EƬΒ}S4$䩂r7:wDMBKbK7l(gdc&9)-7 5.p琙c ?{'9m;(m#9mQM=ih=9dAJO%햚? [ :m)~L#ꑵdO1C+n֛aNf*NbIUKsW3 Yrro0ޅ,R}h[`^7~Nty]'8Ks8~6=Q{+'K2Dzcs=X.LS# $mx"Qݕ<5\/>a[zuIZUXX Qb0'''nѱɪ*C1'fG82mIGVZ CJ̔7t+U䋞E_^4P# w8DTΉDɽ"c֑~ъ Kg=2<@!5j"W7s7z^]Q/=VN ?iȂJYJ,}Y]`?-<7>bW^/i}KXt}SN swי )]7/!Xs G'3-NyzR\߿?^DbGv"*5OK2$Q֞LMYe!g 7]߮d6A>(wP'"!]{HBW}oy?vY߮tm hja]Mx30Vu?]sN./j}ut^,źbt6u+3J#:ݖ<7vפrJu]^34;)gk6\A})Tנ2s9oCۀҾ*FL"M%UA^]ݣ6c} B42{k\83c~BGUA悁C_NȱdK;Vgw&FA.5S,FJ(K: t:R='#@:8ЈdQeδRPpE G;tR%PFI9Z˼9{Z4|+['υ 2fV~1]mחk4?,6ޫB-,ǔ0qAK' J;Wm*[K' uM (Ⅶ9 <)%JG2rň *ƑYĉTiq UQ!iXfDj rP"Ld<L1w: m#1usX?sUrIl>*yV;gM6ށn[hlxbq{ 97ѣΥH^H̀r'M!d>+LH+kO w1gAN,TױuS#k`=dLT6ZYRd0]&EfU"eJIf}9 XE{]'f=?wkxҺ+]( ~)kR!ΖFPq*:Ȟl|Bx P%#0,|xԦ7/Y; dzDH)`UÌk.<1`5hMLnĽ#{ 9 |lb,}2 ୍ad*ŝ %y5jٝz'g~m[ #ӳG:$i5n\-Nunm+iG`iGMi7 C:xUJ",W\7moHbcIl pjP^$ce=;٩㥴결8$%_Ua"!=hs #C.Yt[~}Mfyt_?zcNFߦEnq{wOBC=ٗkpJy"~T.dmjoս'tm:V:T_IL  )Wx- ]>+TZ>#*x׸1-9Y4X~a%e<4Y2``B-;R>y>ۻ'?R zuH"Kƅr&!63x$Aij\x#ڨrD+ YMd(wo19xHyIHJ"Qv"iAg6myڜDFUC^@O^L׶#EkH,)0V$ CT Fjn 0&9,{ɔ"D{. 
椚!Fu,༊3_MC@LێMXhj\Bx2K%%F8)Ɛϱd>T/@D~1J~@GӲ/ʅu#ZiY1ӑq^h|mBO0CI|#=0lqRRdzL{N_s#r jvCuq=[k4r2wM^(iLmMoo7Ŷ?[PEwfDLt3..dUE"}2)KYŨu/"6+!JHVާTZ B߉.:j{20LyYT& SLy./|.+SS |i!-,n]3=]& c{\`CI BCT- =ـD9OnIܒ*$xV\׫W׵$r$U I)?El Q\tZuwx\Le*yK UhC}YsɜVvSُ^7"sG7 a VB'-t@ P"sq \Rɋ#7W Л>l$ xf M6 YM{x٘Z_kZK |;ӾWb9h+x@lFNX FMcyӢr ڠ bc6A6Nj68Nz8Nl=/G7n7<~m)'Mm&ySZb&իu=ֈQZGU%|8eds^I]j=k##)]X ) iRZPh:kI6vՁ&r㮇W 4I oky'9ĭnB&0j u'CvcN4ӯv=9lLڥ TqFiP-ܖ{EBj>kp9ge\$B7҃{G 3 ׮TBH\dթVKhAJ1ηuq)RWѳᖊhpK 6ծѾHXf1  Og6Ae#DbMS%5[nkT1SYȄdSMX}Qj1ȥAj0}¡8WC hWa컜bx~HM(u, ;ʜ-PBc@ T(n9eCZ) ۖVOhA( B>kQ3ޙC$u"r+LuJ O;å~>T |̦6?si4h3&/4`?5UTG!*ttHJkCG$6RϱK[<>‚s8ͨ isa~}E" cC7#|GDq" S2a c \$`C9ȨyHH J#FFkO٭! 1AC}I1:B"Ae@E. ,eTړ6^([oo &" ͚0 ^>]PӌHBx野JQ4==hb_CF!I5tQpGcͨyCMyLB8„GWZ)pg4NÏ*||JZp `@U4z)Mb ߽wv*S{[.rncPËG-;i'2s.o6TW ӏḵ >JQ̈́giK_yɕ/m.em.YݔX`;RlQ(s] JQ{NWe(%s6be2)춦j Bhg+繌sw#d΀m9#HuYޅ p[kA)*sVwTU9{8v!QD^݃dt,%EO |JHZ݃a|/{ a hWRGw,v*AE౼.@Bc S\NZAv >tm5ŇZ]NJ6u׌ [OanU ֍7rkѵVBN_$t`Zr8%[]8#&.jWIXlervId'7ա]ە ] #Ѯl4|>yj*!IسXo " NTķ,N;AHjUpډ >Pe y`R5$պ\Hm_v",ƶ0GD ,IyIZ7i)Đ֣xr a)C|zBo, yŽZief8I%Vjawxl2^ą/, EI (j#o8F( | ܎8N 5c |9Ow}9i@p60GՊyn8݀Ik8_h qXf^>4RtN3\harbD,64[>Hdza1A\06ݮ\fIFwm͍HUv2Vc[l2Sq&-/Ќjeɟ.L/@2%KI$0 8-f_N<5,Z}^;(%VTTzꔩ䭗`.L8@ AU[/-ޔC}ɸ3=(D!`A`CJ𕒶u{NG@x ]Nyc/ g+,g]Lp `$'j'Ftnf{08]_i]?&tTfΏR{fKnahB*g/S݌aPs ^&pnxܷ%:4M'?})g:Aeh,Xt:uUf\U|YlmHGGY5B"E3K(%&  KH8^.u7'fgw4O>x9l8[]8FUJ>pjTN߫n=-9Me_؇5/q0wMek?ښ ypӢG=t6%:4(zM%wyr38I{:=(N"8!]t68uwGC5<^p  V;=gjK%" s.q:VBX 0J!@X&HP˜q- :^۾"%5POּRFLJP1b*J,HXŒXr".UD)'1- pUR;ڌ y+@EbδPF.!NNt;H1rX1$HŗnoW`0~VjkuCb_+¡ÞA(R w)aqRB"!4J1qL)81cq-ZC ,)S$ cm d@8#r%BD1qX-0=2am]axDђQyDâ?_W?fΖE!Z#mrqAb~,ֳOȡ ƙ\ IbXzwt4_wnD;:mTTVbYc Wd~{a~`rӠ|y:WGz\y3iNyҙw[R36̿:n|tbr6^9Fժ@2~xS83F!>x7[ twTnݭh jҮuCz&z_LgL3su=}ҽ,pW_fD+i-rǞ5+ )8OpIQ {}.Awr^]e8 E›P݄72( G$|b2Za͆;o^N> ֳLj4hGKH:+Im12$WSKw}<d׵>c|He%WN_Hvp1@zy٦,";~9d\mt=ݡ!@qN)tpwِs 0m|P:D~`$ut7HWfFl|5_P rEs w<(bհ1OVh4&)6 \~̈́[d`IR~2B8ה N>aϓY2; l[G*v~̚yө%5 3[nP+é*>8~DBsD{gY]hxuq)v #x'GdSKG9$>5XGު5RG%>+6XJ;􍐽C=#P - @4b[mFy!=˿j cB jnWap!{V^tiyz 
8QDž}ӷVm;SDCcͮJHrg8s`GMcjb?uE:AQ*JNE{MJZބoD K PznJGJ脐!VznFJϕ(y9"$mNgH8.!iT ! [Y̩Su*)-~?:hJ W2d-*}MU'H!\p|ЙSڵY }Pl䘏K$։}:!$΅ -q8GҎۻx3Vv%XbKHpyN/D9 '/rK7RԈcū}G/sWmWqնJ@"( K@(0NqjcNbJp Xa DR1s.?7s4T .wy} 3y!oޡt-etylQE:#9gIGO{%!.ZO1ԪV1,K-1dÒM/I6fl%&D:2.G‡%b,Nw,˥󨞑+](4&ybE\cݽOEKYvR>b% VL0,hdp);'#& E]lEFmą6VdFP0DLl*r# nqAsomώ.ʙ*,1LhxM{Xg9^oGNm|RP!դEQh +5.ۑwM8tҫt#@mB"}s7E-n 8ۢPaQr\m[_MֹRC§o2 gvhqti=M-"x*Z~c>Nb Eުէ×{fH(Uw,aWaK%? GF"/V\1D]z>RN\>?Pz4Ǖbb+niǎ܎]3c&5;|XMnK zQ,0Lwߏ.Qxu&{aWu(@nJ^_# 1 ӋgA wlfi|1Y}۟JQ9ĜzXXw:nd^ j `#&ѾRT"rgzE 7#A'Zj+Jm2`;Fo`$E:Tfd:ݝ~ >/D5K3UUjLsk'c P%QCoP+2U :V>'.pA*fw hwt6sܢ#/!-]}\*bfI /Agܸ8!G?{9v]df"|M%rۂM%RXIvJn[zli{-Q <:GhL{URI泛UG}jlҺHz>}W'xkm!ikQzCA7]hIٱtfs~>\aǖaxb-{Jw hAQݓM]8 $ofoWPK߬̚/\ܞYϿ- M{Vs} ,(o&/ln0x16ƒ i:! SE|uc:^,dzި<Yc ܿ(f~`rӠ|y:WGzVY<Wt.5>6d>vd1_v .kחyO$=㭖g*P+94C]ԢO1xmhX4?h\hRb+ Fx9i\Vc\ɢY%MfBmS }׊+oEB?f,8b'3Nv2䄗~G (M+^(/YD]Vd Fm> CT8jz!kTh40!Zտ#wJI/3@,E‰;X攦RvLf ^%,! aJ11F RIR&8BhoOID)8.`IVD 0U9?͢xȏ֫/(MvUU2!LF-Vq۟.(B\A\0?|?x\?.GUHh^ZyHy*Mx#CpD!!acJ{G_!Af2陼/30bѭ,yL-RS #vSsnU[˹0Hm٩V_N5h6/_wG"O&La$\FDh6T -(ǥzn,x2.J˂<0vQo*<^ ϶ _"Zww{[zJ)i=E;sOԹ,R7.F3K](=p7b)ĴR-p5t[hXբݎ$q2.BQ6a'r˅u) .B5 eޟj׷4W2eüU\2mz'{;8Q Ws|N$ -lwhLe@wf m&$597}k[EK#G#7fK Kl9WX3'<|RDb\Mզv=jJZ,.L`F+sIJnFK.^G>ZjւSP9\u3D={vBҴ{3tflȌ- *GиpKTXB$\)T7Ub؇58Nq`ECn'{Y=l):QQe']>ԡ. 4eWwdt!HWOBBZ!aAҁZq:w~N֊ul74Z'8le@񧕝Z9[Q<4 ɤ-oh}lA`"UjH{_=.Br/zc;J\ _x߬t6 _'y}'LU?ZM8ۛ(bgσIqg?;`afD0/tefJc}~Hͼ/ʾ#5}y4W0,DU݁16]]KDŽb;QnCM~0%g$ԕ d*U!.ZاNZ0lmq?~?4Zx-F@9Ny4 K2n <4dn )I9v401|twwwkvnݮGFf-ҳ rD(h[Dp/f2_$vArQ/H6o-J+TiO#|Dm%-Ӈ?r3pk$29O =P`g fK2ٖCK&=uS}|ω!>EO?~i&X8 g0^e5"v8;W)&"‘ʞΎ'֬'* zp\ 0e5 07<!,tW}on82oU6!YG6x9oI̟&0R%.ou^X- ke8& Ld;F3b ^}p+Fhx.t(/.s%[UrUoBk7e xwK(@}gъ#*xUK])Ce+ܵ ӯHYG 9RQ]JkY^%?j)^9_(/AdqD#_,-C@3U1Y"i QLڀ # ډF+JlVÏ~ӥ MHq KhdQEXXsF4 5a1(D\H$B+K.a)@!2y KT∑ؐHĚDX (»XRO,i&zVډ6: umyͪ%Tr ׭  Vj+X;äC8Xq,"ddLOR(Z|4ݦSpP4:G5e*]Ćǡ* `&dWX+l2„FCi悈!{Izyt"BST P߅AJ3RGp>c큏ㇷnO,/y~?[0 " 9g71嚘O|By!bZS*YˉHy兀+%]ok! sx2+Z@toiK0ֺVHD+r0V F5LX+p$ bUq,X U(؈0b!c6d&!EY. 
C>c}܈'m^<<@ktq=#êT2._&.3vt`w|>[[9g^Vy"o#pJX3 b1UEP}5GV&lR^Kuߕǥ<( ]) DH59'>88}3C/b]/tIِN8Z'ENBpmwm7!vσ-G?`auz8tp4[-<}kR5ž)~n4t׏&4# s|?8.!$R#U~N ?aT Bv. 5Gi@kP>wN•9 F$Diݙ0soG|p@/g b1~>b#В)-iST9Q0Rdh:OL\b\ 35"vGP!XSO%! xb:YK`ʅ\i!_6x3pۛ?Mfa0TE9U>o2&b^lEnb}̪x ٙ5,1 ]AR,ad"Й= ")X@ȭwOÚΌ`sO+|џZUAMSVdK] 0mzt}ڟ=H4ޛѧ|^mi/_־Nudp{p:"kE1 M6ӊߴUPEB}dƜ(iu|u%}xBgsD4X/*.4u8"QlgD5UdϬ1DY] iTE iVC}TFuM UIGaφqt^1gſ>l+Ĺ %> -Ƌ)?R,E]1菬GO>/Z׺FŇDTyg>"UWj*aF0ZGӇE@'EZJVB^%7}D\*G !QWe32(,h-sIS!ILIe!ܲ?Q3;ys{6o, ,e|"E/URVBD"Dl}o>CL&|n\aދm]_=b`l孁",o7sFz#xem 3-j 1:݈|BIV&#R`TKK TXL5X*ǒ(({a4CvG ƢG WMX܁ȇ Jـu_5|zٟ?޹~vbu>Q>tEkkBr-[OVb1K.j?qv|9z`)W=uPm+4VC0DlXNܺ&>ʷ}iQRg@JV[O9o>Njpwх}lX k@]IƺIAǥM$^ BvI2rT61Q%ʀjԔ"͌62עF3߀fu5[ĉjVtGS`}7Q`~9 v!~7(鱻"`xOe9hó6U@' 9R^Qrze0OpY+rdrq?J/M_x Z\," 2fQbU/H{#*Im!搕1Ig @L+p{7fVz"Y*=,$o=!jW?~pNGSgCK!UkZ^Qu@J.Dv?Rp()[.HT('_P&%FBI'w;w;Vdz{g[ Tu像SiW0~ëF^6?ʇ/7Iu&ty'cDVtՃO4R*XwWEIK\IgQD^R`r)gm^IX+G4mݸ#"5D9 C-8IUdHF݂sRTE^P#ϱL R>V9P34RX`e%K>T`GJ`(EPBuCna#{;4aD_5~l/({@s$BfN0ʻ9' ˅j,~aqʐ Ҵأ7"Ot|F0; .\ڐ+_CD;]R֯ ׭爰ZF-KV*/A|IH8'9׳ӑ_xA_/n$d{nbU$ xi6E/S U8b}燺NǧoԎ̅;w4#żUeů<_/nbU +H=n!i^xS D~O* _:!A  ^WpHicqGi'($NSlzrDtUo'Ek2˭7 @Nu`(,tO1pQ.b>T#aƒ a2]k5=1ZOvWaNf |"RC529bFY8XbR ?s%+-eB(&RX]QRFpTIZa8b-7<ˌL~h]b1+^@fW/_a$)C#D4E?NS0G)4 p"(3:hvV@1om&as=*h8 -VL?ޡd$&RI#`α( j >MH S,\b_I]PZP&$+iq~Jo>+;$0. 
wt .ɰ 7PI2:ε4TZcd ΂A:""MI^h)=y#|ۏD.]tQ~ɑO^C/K6v2Oa<>.{_XL ی3؂s4?o#Xrza-"CM_1UZ!a߮$~/"{VL0ų9, >yZ4` a>{ Y7ܬ@t~hWQ:UTmuSۡu Dub(cݺg5PV|*ZY4=gdce4p(;L^+0ӻQ6O7{OhvW%<#Ϲ?E?޹~v>hN >,6$]O,$pY/F] [LǭZgq[G\{zSSrpٶ( "^[ZAa!*6ei4G*PFB&RI)N!`8Jpb4Dt@s__ԢG^qDQ:е1 耈Ջ1oC2P9ƘF28+0fI."É&UAb.5Q5\Qc% I&YXňD暚'BZ$Rx*X1ϰ8h@}ɣ?8oڠDH=BQU\ RmH*abV.#pb5]Q0?QqʳLi4%QBjiR}&fGjIEbFΦTk]4+W(]] ZX BT'2֭;wsZ8gݴuhuCCr)MOܶn ^!ۡu Dub(c:\E=PV|*S2P!v avCp(a7hBN3u"ɤ3V`׈kRtFJKKJhXb'ZGiy@x4_ngćrB~4]h鞝/R#*# _ˁ|Ls\?_>_p5q'?|=<] "b`i3??^L%!ye>ŃNd:{~g#<''넀շ{V)Wo}%{^ Hȍߡ?7RB0_ '%BGn6/Kz-zm3vDuar A!f'\ lQUZLR{}p*W]&d:}z|ȓ:.%JIɋ~)^Zg|)2||lPC.G.R||Ebtb-TLNzpЦN(TXɗ"3=̬ɻ}:KBt:3*v2(ZdB՟uwƸӕ~ry6MFT+-R\˜lѩ4R24MKM(Nx>YF:mjYBΞk#z|-Om929iKc|704J9i8(ńFG oTepe.n:-X糧ÁyS޵6ndٿ"b'+ޏ@ &n|Z INǙ?U,Q)U,y[nN{ 50mPb ^e!e(+L(u&۶jJpsl` 5S;nzҭuTtr)/Hu%c@R̼_pBl~t {Qf@g2=Lm%PfQHh skMIkH![Trء.o,sr3ZmĨ~/wM/@h]+Fxa/VvDYbĤ'RQCdd?Z2jD׾drG3z^oP}ʚw+tƼH"q ˳č֥cfJFZ_n5_fޏ˭}pbuQaG'[ LslD⩿nAq k%C6<wQ+$ P}sZb`}hFeSޡ{&k -tZ.Йxʤ%o;LRYdVdgQ3)@\rE!2ԇ`<",QoOgOs3 q@Gْ+?.כN2sHBBg./Oc$RÌpWk{|F_`w& u>-ݳ6ra.:vQ]t?nW=t #e>@G{E x⡗?P0mv;|ط?k)s uI&9n ?~πs7w:Nv8̥ 4R&bJL;ݵ̫1ѿ!! 
gIP ȃ1h(iՏH F @#F $D'2J@p`EgTFG'kg!ܩڒ C x̧RrQ v]j4Y}Xv 1d 1=j 0) 1t ]% TkD ܶpl=SPdM@" ?EJƃ ݆U wnGhU,*h,xtωa VSZ`aX \tD򇾮jS"^FK(pWߴZRN=Ҽ {`3UNl(^a(< &.BJW:Ȇ z;ڬBq'2~z-))aeOJVijSXRMm j~ tD3fѝb/|.[1xŹ:}/`>%?gV,X,0 zna2Lrݷ:6>P@D-=+kzq|'=SIDtBi'D$x`' 9  1# Lrĵkm`xoHHŵ0$q ]cNL+!aM0"ZJq鎽핐R+&);%~]0bF>ƛ,ҕ0$혡GY9UdPQ49 F|j^ 1{Kv^7DM yuYT\ 14zUŮ"0=E)-?v =ZK:Ī sa>dQ,P]z2U#kIjCƚ^>F,ZCYʁ`N[°z]"j?O vۮ+JҬ"?,G!UZtm Ңerq \r1/C6Sxv3)c?LONb0U\8v݇yF ?L/E?L5A"Eg4 i@ֳa|e=SЏcj8{:;gn!ʩ=# LDs>A`Ke c$x )MR{% c$$ P }#r8((X}I$h!a IR1Hb"I"x@"'!C{9ƌ!b~Lr1z#4u=f1-@!9T=laCK *%W#@4J  Gܭ죟gOvb&\;7绱 f'hc)d?;O;`6Ę~g駗BٖՈGOߩ;~pcgna,{&zvPÌG+ wCkK3 %wy8vwQE՛Eno1*E(DyIJ '{Ď~H-h9"&?T?^]f]]ٚ] {r)<~.x%$B/{ä/;*SOvdE/cJ*O vN}thÝ_xR|K|+O |{{Ks6MHH&qK*kq2D q/ X}#5L(Ǵ@ Nn[ewc%84eC(^1]vfzYt M !חq{9Q؏R3X(LcGƄ Q8:As'?~;' m ("_9Ѷ;6Ab.uM},jP}m!vfm7"u"c"^.7Muq檵wLelVc Éլ0˸M0R&o^mʇt_nǫH\LB۷[nC #h2U2 F\7m%}tBeƛ9-}Z;cݑa1S͋jqNafUiJ=+n!\׎|.jSSט|T_B}n;D>Y f/coE/)*.,4pH %@B߿ojjnއpA+z~ e `bQN9H9߹]fmDž̺I\xtϪ )Ylz)-@~ʆR됑gpz,_^|`JJӑ؋PZ[tD 3Nwke*̻w:s>Yzxomؘ]cS?gkvV[L sKE3ׯ @xF։ ⺋ -=ZkzZW䩱$8N鄣(,d"p 2N;ЉɃT#W8ޫ%<)1RkԎȥۮ_<'R1)v{͙1s01tN#b/ qwdXnI7bR\%7%M Л9WLTjP]z0>lNc8c G'agʁ;a^ |g{jNxEW!9pI0d>Ŝ79|W#_{a0ldN tÖJÔ zYhy5dX32ä)t0S>E{-1'ye%Ⓒ@|;#-v\ pM9rBȉh h6)!L0oc@$nz= S)8'_g֓'kyy-WϏۂ fҭf\Hu+uB"!'*4iT%ǽ4TTs9 XJP0 !()$H@S $yM|;F*34)1/?ߍw֛-^sc)d?;O;`yk*YnaeP嵁ei ;`=~4%$PuVT47ZrSɕ/)ij@n3b#:(To3&q<-j:uϔ-xDɍo/֏t39*ĵ3Dwٳ՚/d| '_y-^[!ȢJU}/VB UlD_ȌPhRh#00 RTe&, R8J""T(A!Bm8moCQy*sԠa>Y={Y69"Fj̖kfQ`~)# #.!t EP>i}9̹!Y7g.G/8 V:DD"~8E8FQ ( xJ!8,1CZ'so&"$BQ0bhBcDrAR'$CRDhST]n*bB5o=X_B _|~T3 "zưl;D#1Ht^plL66/lޔZBg@{pG rY % ϓ5^.c{>t#HǠpK1s tż.sϣz :akL+/j?)la{EQ+O,8w9S(;0V"G=LqC~&Z/=)Zse7yX>}tֲ1ʀ<ԎsiT?RHK jh<*GVA'!PՁJ>KwNhݡ. 
RN\3zVZ~Zk58qj`ɶ1PInyY-7;TfTu_d՗A9&%51D/@1z_XȨC\<%Z5C*C6uV\ᴹU&)LY^]2S"  [!]!mŀ0.l!*Z<) >D~9[Cڔ" `C.ڂ +}- $sIe h!_ %o!v 8kG{!lB܈I;h.dTf*3iHn2ekS/ *tqK~B}{"%cì^N)$?r T,mr,o`H7#Вڍd:PkSIjX-h ``#;Fwx/ǵ0RKROJwDIkl6޾nh\GL_>5u"QP?@vk%"٦C\^zS7^;&#nM="XI:fN9,D"[g5_ktPiQ8m6m̗PJ ZE$@38h`Z]w`iqe{xo'VXv٭BYC+YРvST`TN;CܭL.-@*)r4ɗ JA]C*x PovZƠrُ} /UVZ"qKI]:Ee!@^D֌"N.×a*.e0+?>0Ӑf/n2azCw&v=Zm֯鿵ng̛zg O"$&:!跇< T煻T ,u sWVF䤮B#RO:zb-"\j9P˷P8,!Ӓ ?vEI:niL~`J3|`Jr8ґ>qOL-ġrh}+fʷמI$Youw8yJchZV8EO/]뉌EPX,x%8f[s֑[D2!mKVBt^F! ԇ T0C `ڴ.6 C )27K:Pk.7C=Di(('B [d~zڅUh0c0c1ydHQ]'s3$JU*TwTz.p#_~pALJKiOGN#yfaWtGg~Eh_v+EO_7BP93 ;KWAx@ )sZ{U!np^wM p[=/P` g@@ՎBB &Lrd.ԩ`!$Zr6%02|F4 ~F^p>!Q@ ObePDH!(W DPϧ%fQ2\}:ŻO.bDq;>E#RT@b7g3k_hpݜKJqQyYUW-:H% Z猽\@s͔>L0jsvr/!̲0qKˀ{S(-45eԔ8|7; 04od8e OaT7[wƼaN6#^o~֘jtdHc̲J7F mP7ۻ{KO}lpG:L.il|: f `rVoJ &pErGj<(0B텸R486g$m ]0DCMH(`@҃498Ajvڬpm:H  iƅUpHCGDF؃a*y_Jf`GѓM„!bGT6`VrVCB/uٻql χs1| Ãdwf4[7#>^ϮJ~zGu\z 17pZ%e>]j5\O;%}xUݝ+WߔoiϋpqPS˷Pi8,v#{2EHDuyV+Z^fϵC' :L9/%j3$#DQanZ'Ca\{mc%z;w'iw=pc/{e_Ծ̝Y''})!XGdOq z=3$#ʉh`ޙlX]u.W*w"t 8/\*.o%WkjZӤ$v,AhEO'Ft]E񚯴R(Acq|`&i:Yu#F{ aضX㺢R6'Q+LrdAhS^oU6KK5>i>8Ɖp5le eW%IĒ+Qbe~Ʈ⑥l^o\#GJ 1m248} E5xxՙ)@2@]08T`+}0"ETF.`Dc\xI $ I/Ԥ QD||C>:\!_ ?z}Lu4]05.i>'W;=;Z}4pʚh#= ;?:})2O~; &#z|x{}q ӗ-B%b_oޏɞ/-_U{?8^gvT㵀y)N'ݯPG]T1Ϩ~C8|(ʐHr$5Ykٞnh>[&FqK) Ua)u7\k׏5Z*1^~8U2~Es6r繮6FԢU.('-"w6njɝ2r^di:O ? 
.5tr?;%zt:"KOor4֗sV=w7kkQe ]#XP0e@ќ2g9ko犉=369zM| nGk^l)˜#-SXvWT;Z-%jA8R_3cG&QI8٩ofJ͒&u_Ǽg36t0竗t`oDiMV/ZލTٻqdWTz9dUU~NƵI첝ٚLxmUdI+Q@RtīI%}x-\n'acT,aFB"}^Vtedݣ!im!/+""1[K\$y /rBL0(A2TE ~i:+mS0tFr-"t' YVvW'Xh$-r,zcY΀ȽRJnm qdʸxR'-QlҔ &!ٚ/$z{뻕Jڑ%-D?r\h\tbOY3X1%5goqڪ/E7QT`c4*3ķ#lsUۢAC(C jJ,B ӕ..UOU\MF Zfjut\wS9#u5017Yւuᨮ&cSd@ !1dM,&VE#rP2d(?b"M_J-+8aE6S'SQ ù2CdČpdض/1{ ]l͕%(*Dˢ2N.` 2LnΕ%8g:W>lr> (4 QVֱ'a_YEF'd|7.[Fn ý[P'9:jnFj_8j@SMrs@5'i-H6IQWx⣼XpxR\Aw]pRkRU)!sgLeR@^8E|y 6 iQ;r^RS(.`C1lcJYsBs}&JSce(z}zh֡O?E@20)d$!"JN^nܨ^vK˃$hY8>WB햅2ŕH&.{t*lNtANHq"Oc:I;*?.3Iŧ?r8a&- 7 (>=?OK.ڝ܋%rG1T+P?Wr8̻3ȇ4`Cd+ۏq,[51ۣQ({AXW,sR!bm/'FUQ>xXcGtYFr<xLߕnlYU;ӓH5\ 6•O r^o,^?~ORP!(SY%n}ٷ2ܐU\7~ ulkxU7!^ Z6orM QS.+AsC(tV~_q@ !ș J CD33h+2XQrLFCr_(>-\bJd0uB:Ff[tp;#b0YsgM 22X}  = 6Q0$!LAfKF,Yh^hmv,싈y7KDaȠ\9,쉈<uUf^L z9l 64]ᓄ MH(@18F ~KѠ܇8i˅Gg kZ(QWaHe8 `/ }"m-m-m-mSD'NGR/|Fl߶ HWr׶=h)wY"bqbs%:߲} ahڅN&RKV-N^H8By1XK%Qɖ$Q('Tg:J[g<qZXutf&s:h(?y3E2)K%Di%}It4!#Nf}+ D'd>ql_ɹCl0cWl rҒÔc}_b$KH֤R*2I.Sw|&`Vg.Hw7N,F՚,H 1(qج*%1'IxwnuB'Ɍ2GB*X!lI=eP92uD"I%| 8QWi*HH\Ѓ:T}nC[zь˸uGUx5y rj!8]O~~8𗻎2:qpRG@ #'PվܝXqƈd'cӵGd`%3fU)DGĜ=*cty^%~脋+la4YHp;ФiDRN/CVA '5 ~óc߻̦խj=>EC -N?FCz,': x(kq0'j jA0hCĚ@[ ~ nJ{[ AmE83L?( y#=E#N{>Id b SF*MćjsR0 [Y mm`4lZ YU3}"'ѸD2"I]`47GɓzwF,q/s&g"fNQPU=PtӾGsk2k3(H,\l,iNǥ.D<%kRP2eKH8'ňHf"Sr㾟j5')ڏNZFv@̄Vb6%73bdˮݫ ^k:S˨Zby %Du:  I"Dv䚾%=9s[,QR DإW@i8?+i#sseImx{\ KTa!ߩCQsl!%K 1PRmyt@B0W9]u& >~+ :f]ӂc,иjw``QkK6:Hʳ6ss*r&ހݷb@2iUd4#,7FW`L%YI]f*(qq%nTRɚ8RI|LHI/xo%Y.r08SEFG>$v*+s'Y0y 14R)Røpf\^8Qu兗(TXxʆmwXbkoU8x/9`ȥ mlX~& !k9BpэR\ P)1 OW HeyÙۿխ t5r%kDUQs1o4*g@N^l"łZV.dlBtʕp1QrF:j6a)k1DnwMXQS0Ʒ%z_UPdT(M2G gz׹s.}K X6(v;K,85KJx+shFC-"2O)9reyGl浝=D ,' rS˲򉤗Pc f*}*["FÒ(`ge|hGf Yša7qL&tY;sLcLm9::UDdj hPUޥ霵"c%%ek<~"n^UMGVj=9rA껩x æ#kTND%bdʈhJ~qPK+% /z^: ?ƼVS2sRHxt}ONjgvϕzzrn߹ʐȿ-laz:=h{M C%[+!7t}= ɸ߳ܗPL< ڵWGgV/l~~W,< [GJAӷ /?.zO{OקgGNV{n뗨u{^wd׻~뗫0~QRy#eٴQ޼RcD7^_Ǧ{|hϣ=ӗ:ݣ(K>OEa0ϣ]_}˹[dt_]_ܜ]ݘį Kуǜ~E7?ď oWwn~:SϺzv(K+pg_CՇ`8Yy޾t}vus@?,nZ-0zr[pbovsy!wڜaǻ߃/wow:sv;Fa$o˻˫%DtN &nƝ{G/;`G}FϺvQ‹~O~n`|}1?]Xۻu[ɣA$uP;a?C׋JHwtГ)A+ 
GA.筨$O*RȂޟGanҙKF ӌq0hb&u_)iWIH`e`~Yrfl%_^Np=Wmy񣞤'eK/No ܢ^{/wP`t)~ZB4&җޫgF 'gvW]\]$g> A\ }UzȤIu?T;} z`p:|k?nH,+eRc>i?UEpBo~yX#Q&O0z8C;nv$vڪZ$ ݳjYXJy27#s3ːQiGWՉK9RS\6 ԷzoнVW]y|GVO?9who#$ep}lk돍QMؓޖTx $57/\;Tc]zޕFr#R d0xDkk xc{idfvkU%%t$a TT<냑\:Yc]'r|/ESP<:˵ȅچuQpb$7By׋f~H?|nwidN7\5h4,A~<=:Q(kTlxz\5u|'7YB÷vWK .q~uxKOBs?-q 3!Fu qA <>v{v jA ",r{҅-iL?Fl)BZbFȪ\**U6EB*6A%V0wMٞa$"flV`lޟ ˆ=@lrN9yL%Y,cs %4M\ct,[7:3dh_ߞ RdkPqTRJsoVSoT^{"d)h,*1n3BMعeh3d՚nh`c?JمWݨ_ 7˳QBCv}茆C0\;dQ7v}VOCoDS+1ɍVlF4b?fFT9A̐ +fbB!lʵҗb&z2}ib2BχԳ2&}ŸWboW4QpkP=.hr6=׋713c֯i U^p虍ߵfC|7 p1QRf(6G*a¢j<@4TQbNK߬}IM+1&nD(z Dcq0IlIMS0T^R03/%c:8`r81e+c67)L'E[)94\')P׍LhܺWeĜטDyAډvƁ423V!Vei R)]Itmx Vɕ8r h2[߯ұBD5W(EߏPVՅ@MUY]!4%%.FKÄ.BF,pY!) :E r?KC[ɀIP1I\kWr{Hv %7$Ryj.V#Wҽ;ylSZV&ⴚs)s;ֻ:[fvVˁKWͼ/U]oն}Z-I6qMoΪ~ccBQA(AjtB;\d[3* Z>8xɦw3W]{lV܄+V'ɇqF܌O;;1!gNdXԐP۬.-J^Jn){nMM[! ьl)fƚ;,>O{ࣅB'6 zSzC&5 ?bԜmްr]|Yhdf_\V [Urò)̀銓s\a/ᬮ7~v\f.MSb=B1' ;2DNJz M{ԣA[y,q٨hs<ِB] ^tlj\ȩLA-b#DN1O D$T_bϣTR``T==9Nt7c"AΉK3p9YIAw3a;.(!yEt I}8)-p"|RY׋xwtW/2!1 V-vSuɋw_{:u}qqZ ىn_TKUuQzesfwUsTs}X_pLXYbd)kכ74_.={OCrxs{x l]1s218[]U ?oNC-fQY<8ºN(Mx!9cIME6#%⣡C($k{"AϺ_Vo¯^OY?' wo UEyلgX<:)\]Y#}Eolbc_a(#a^c) XSsEkBN$ҝȡtnZ9oRq4-K\K c<~ H#,]'Gt8g^ AAȋ_!ԾP2)3A>֏ZFVA\ b6%d\)k X2\i6["Q161R0fZeD, $DzT[:EH{9I}*DG?]^ywB~7&ZQ(ff^RlSa {|p!v2Wzb!R++sl-E3gT^F36q44'PCVOG938ĺnWx+렋'?P{چ< -[4,nYcr|wh?C!~he&j@1m >$v6ghSnjfʖ.C[Ӣ>my wg.Nvr{'ƋrQ5of΅[~9]}Z'nޛx\?v#bY6ndq,60a)ݎvUl9N+wŋL19i Y9>i3w6tg-^̶ǭiKq6`=n22<|v8Sx Vvak\)=7@di\yPր).CGBe%֠D ilc;Ϩ`-W(f']ѵG;gAQQAp?(,l9[6oQ%VGa2Sdq"&ļM% e\gfNTT(-1nP\~5q;4;F7ewdpir/J.T+UܣJ\95BP](kpFTGd={[EJkiӚt %yšp Z+땋$-+%ɩ-H\)&.oJ˳zt-Ӯ4&TW<+zmR-Ono);y5 V8/gغyɺmnl⛐ut]=QǙ>H;2}@nW"8gX\*iy$TOiR6ɞ,N!SU>l(/J <لqhUYOtWۻ;v,~L3dZOћ)x5 h6;QhGdӎE\ڗ]mz:m>[YVOR854UGB-YAgJ+m:Y?s@Ǖ,1K̅7gIџ-Ak_Ur[- zm﶐!5{䳌qe򐳞 `+`@FrUT ˉ ]4$K}vg&'b7`w3 aK;{.gGh\=BFO0(B*Q 3y \fҡ2nV~KŬV\xOB;6J5vK)Pˠ7J9>9^9$Ł1nk@UzߑAɦ?ZX`2^"MR? 
J!i4vOvlCzk+cu_W@su_}ԓ]qe_q 8[7撇0fN#nǢ[X>Yϓc-,x& 7zI}yɲ텚4).58379#ԑXˆ:>Z]k_dr[.Z-Ezx5L|y)=)MIpOM djudI;@s=xwBV$9Kn<$Xή~UWvA)ݽ|s 3fv3øf1y.øN' |Zx>L֚gJRR~Ul$+-3{2)S%' w#J`52WNWѥQӽn_'! 1%PbϽS35MxP)5 yE)ٻF$W 22#/zrz{f2WYOdJ<,fmZb)+#88mLS61T|<ʥkv(ev7џwK+,_3ą]y3[;L킍M{|=˗L}IEƪ8zaZ`f89}huOk+8f}OYTR% W!G]Q|/Zr;9A|JTy(0e!{(3%Ue};m]vw7 fWY~iopE=;,T$՟ |tϞQlp?(3>+u@&U+˭[R({?ez>?ϧjfE7y|!lGIb/o .[25^xr/_[5 (sO\ވAdA0DDA w 9F t(}\xa"Mh-eH?rp:6UdpРu3S}4 1yYXE!/T#BU1CCk K/Hw}ڂ-پ*^a{彊 WWv^1h\[-X ! Ee {e:e\/<J@Gg#2ɜ(&UTNOtBLb4Vv@ d"+0e (M JC~ Uk!,)i^1\Foe) %v/ OIy\x! #(9Dz8Iz *A?iE[ WqtYVrL+^\r]O] <-$J[6ܶw+ 9)N ȣqE.I<2#S1^F5SUsIOU% b~8mkϞ"gZQ8E':?1x)#^UN3CM`A/ko8 rJ$͊Z ^ȘNd!v-FJ@Bx+A{[-(vs7{ PcCrrt.Qr:P[FU݊l{.!(gZhau`7{w]_3̔?a/L >EIkhvBvYd3z(3zRV!@vFO.5z0{;'c\%Oo ){&ֆc]7R!+ޖ,G/2M #W_%U]`K:*LEQh-َ; FK *k cbAZQDb&yq)y4Ѕ G~glru%tb\$!i DiN[d J@5͙:Op(;Qv3ld+%gP)"Dj ESȟ<+(:4LT&SukKѨ8N6%%S/)U} EsPIPXo硃) Vx ( wx]I*NQYaL7]M.Rcigk;UP!n‰jnKz}èZoPC.+{ٵ 4 t}{3^!_T\Ý#_F帚 果ɹc<'wxO]PJy9ܝCFhkΦy/Uoz83 }B46qсP^ ,'j!̪ӆ}X1TL 842J*/hPY!s(t᯴B]Z[2CBP+s2չrչdŒ*y)Kiu.Dm(o? 
ƔV1Y}X2 nYߪ:Lpj.43fj3/Vzo;)R+Cw/j]N2ߘǹIk&qˆ}$:WX\ +VhD:]!>'/7dX nh#$.# @nӁ {vy#jievie\[=\ Z%b9ht7K߁PlKfp?#Y([ە5VʥAnſg3qAp[/;d8{a6h[Wv\Oȭ1__My"VYf7*qp@zfD9'NӈHOj1m)#"t%=ND6#cUZt K#K1_yo-] p+q*ckhSms6G޻fYvx>g`m!m-l*5, @Yo|f4pgL8>/>]=2~c|7wWd,rۘmgj%-9-4L9iQZI֖wxmiC%}TOُ{P)vjrPٕ۹L EAJ6Rm7Ox glQkr)m5:Tu-rZ;F]{Z w5 MW mN]y:A7΂|?pLL:f$< ^!e~rK?ݘc3svqc<3k$gv3Ҵ [e^"lcBncpi`\%|zy}G돲e?jz~|;AiF%ZrWB` 0p>0VD^zȠdPO5ۺi|{.=d_c9A/'L/7z}Ip8'y1 uWYCR޳SM2I=!XtWUua8͂|1c\or_–\ FR׌M͍K" RDI6xˬ( tx`CTZ٬SRmۛ)"iΐ@yg0N şşwvz;e2gDK[k s%$gU/qKH[-=dg1sqErU#4AFk!'Ǣ\{#HЊɰDżXmS0APTu1n2lѹD|PQYj!IM֚S- =wqBCC`!y 2g\f50`NҜV_ |%V[EQ+k"QK<&DⅈFۼR"^_$}'=$T *24<)\g⾨]ЉÌ&z!^Eʠ$M~M Si釶 [A.&h.OD3C%amyyuLs}jJbjYڼOVx')WGߟt\ev*}9\d:kP w# :zV%T[*@9ak_4nfJ^̮|!X4N"Ł Ljڧ»"w<,`1pSjhwSV(%OS~g+t5ԩ rAr~)NWld+~y1CΗOesL/;^.R륛ކ\B8yRm.SݧtA UBa^UE犹X{Pj|9Yl+S1j\}ЍV)ɮPʬ]|Phk+ 4q8p{nQt{B[t?WMOH捿BۢJ䯍W _OY.W(ub6z?WMv8ߓp~ do+o*xE `ԜtV:  qd9h0Ǫ<_in>IE_[h<뿢xR)ējLLZK֥#rGbବd$OZ:P{iPΗݯӢw¥VY4F ΢#YHQMJ6iA'S*d MTK;Gqޟv͈2L8@xL3(fi,ᘧ܉ULIwbq2NLoQE*7h31&jn3%JQ(O6$Eڊ( HIrWM^7y%|\dFQ2DYd>Rm":W3d]"ByEWS\;PUrka E׌T"kxz$ˣ-(.9,Sgn %48*0R*:Q ͼFMxܢ5)"ml, S 'Ez7Yx;Y1ki(o+ F0#x} <2C7ws9&/&Ro02m$a֗j.5WQ}-M0vh5p,O:M?o˭(UQ?3ЕR}$ih_o2Z 0& ~:Pxxt5@K :Nhr_V2;Q=\*:,9Nn_["_YմA.n2y(?ˉZȢw?1FzV ю1rvptd7OwzYޙMj'*4c16~ CU2Zi0H0hW[::6 ٵm`ӻ1%z__oNA)*MyOG䯎7}y"{bݦC>5EOZrFW=_/sܐtzsA5N:jxmE=:(bnuzO׽O{z{.[%g?W`笏o؂#pqt| DŽR)Hqq:SjAݱ'KPO1FK .4͜)Wx"i#iMgndA\ G v2wo:i! 
iY+zWKSZrCG0DBTex$a-0lD nL"m$:cHAׂo˘PlQFc͙d2؁I/a Vuhu#`X\QXq/ g,J* z:$$&eF[ -G6$QTNnG7o-(QD w:$,%A!j">z$wKM2nPGuU8fח`+sYSՉ!sҤ7fcsᨴBkX hQ}*qK '0xtTXTR5qxJ+$ KEo#qcC-ėfE;Z8d)I"BxԈ?T% .& a.[B`1*A!?jhQ͑j!Qc]@!0hNz&@/ik>E QE7ԗ!(j=^b+3O_=!^,.{ſݭӱϓwlA6<7R&Q>}mq`\$g;x}y^W=X7yyߔPWdb&I 4y}Z!.Ϣh]ܡqO)fHF_+ج F]Nl>?DHj!}r w~Cs t5G>$;g-vr!g//o e#B!FܲM>ZHS('?;]nd>M>}yݰ?uq22{>^}MړKy?n`Žh8Ťq8Fx$(Sc2jN.bk?UvNesz݇5 FWx)fϋ-d_.gw!{52 Wcze#F6J$p06镗7?qB!5kct$ѻ]jJx Tm۳QjzzQs/i*L Orv5\ܦ/<f*q ?Ua4ZeQGm;捓YV4@Ԇ;m>4$Z )Q&AAا}q1/ʚ|K2hƬ7UKI:lrQ|_jZZ8NsmU.rVK/Z&b*@vܘT=vhr'qTj)z|6xg#fhbofby}7ǛF}W qVL'+O_ Kf]e獀lƛQS ?_ \X3 F ݔ\__S@NVFP7tY d:W,v"Ƚj,}Ӈѧ+#uֺ0RAkdN?6n-yF1:st81Z{K5KFiD&X'*z?>qpN6dAM|MR)ڻOs̠z+rfSglLV弝u# 5?]ݱ( 5_7B:px](OYu.DsK (1t$K.>u2Ջ/XЎ{V,D %)BOXwғ(eK{ 8D4zbbk\qh5EhZZ)bMTr-2?BϻގqWZmD˄OB)3-  R1WOq&SR{TAY.!i׍MhZŋɊdzֆˍ*<{!wg_o{/hًSԥ8hj,!c9s 1ȳcAz ]W`u;t}fk[xW{.ABT@t3ʄ[ 6[- |*QtTZbαD %P ޷(NGesTK$ /@4R!G8D tMPD5gcDxFE"λ:_~~i?v!mJ^ O߀%ӰpAD5## PCRn4{R?xe 7_0jv0 c$q#ؒut-K;@wcm7a.%P-y|+ CqH駘2)&Ґ%;7ўo}ipIMG,WׇW*Nݼ :v;9L\'3jzd*xtp.ndF՝G_oJ:v7-`aR+Y/(`c>xNE]::8DBTex$a-0c"тHr 4FS=i%40׾&ޔww-fK#PrSuf0@̠=O5NX%ٕt ZëHYYxxH~_3:Ui*lj寻7D ,?}73ˉQ).=LK,:Ϲ]Mu2-9g5,Siw_Y еӡo, "\毗_ɿ-ɚѐ/eyJ@D cxIYMY4ZO7RFYJՄi~GJ6Y سڼ@{2S:\ivgͯ+ _Ϛ^~nq_Yvi=Db]g*Y#}pDf ЗE)&>ԓfIeɉd(|пKƠ6:xk ,756jOXE o!1!*ZZG˼wFZ8J ~9)w}dQ2gwQwWҖSY@Tz-IM).&I} Llb(bȨ'UP) eBz\8R1PaXDf S` P$BJKa]te&C(Ƣf~URFײ1zU'npufghI5b.Ps8Wt|) Wזw<ĈkG-4|N`<"v3 #~Z{""5P.RY5x; t&T冡KX6g, #-FLt/s4^.2K£8HV$HGB=jéi{'n$Te}}%Xg1eV?Djml- )}S;6_00U<&Y#̅N(Vc/k05j0`ոFSI(9Z@0Ϊ *0j!bAJ~̻v*ۚiqy~T\YCbfNye!--ǔw&%! 
g=2D@c9jd幉T4N$G1iRM/MBggK.@*3'0Z7[q(rGcgmn!CZ2VjT P^% R5FE0ee:&ePՍ=X[+v71(.1IX++V=04"ס;}(\~Ao`C >On1E ԉx?"s[qHɘj?ehY P[%hf{Ǻu~T9CFOSJ(vÙ}3T3}*ԡt9i;JNlNV3l٠"ЇV8_co@D ~_?k`yEڕ^^ z'Kx^K xoSrv#vwwr-wn1$-:}8_>ƴ sk,K1~}~0?p`EwkSN1 A$4sr42YiiY]jU ,хpY JIj2+4G0(Wʔ|ـ2I:F.B)ѪB\Hxrj#L?픏hɱ*xVHԱyo1Ԩu70Y!FWeEY-[dhLoHդ9 jOB2ħؑ1m'& '6 >wαFV[j5WʸPL-Rp7 ײha>" q슼ٿK99yw ,-LF9eb(lZN>ޭF]U8kLGR.Na,;0N^A06?$+k'-eƿZ v:L1-:) b?2%x^rbx+RBW '2H ΂ƛ6@ DxN&+0Ђт%%XsI31SthKk-r~U@c_D`.2L`ƃSZ'h`UMTtZlR0f5yO( z$`HmEѦЇ*;TNga;agUnT,dO0"l.QgL"8 j ̠="YC5cqZꮳ061}kM= [e-O"C!k'Vm=E[c7%d՜5d v=1TYNA'aFbw7FQ 6ZK EDXݶ]\_t<7}Df3*.)q5xGnH97[G7zpi|~TTp=AK"h( !.9IYO鮦:/9+ +*9ΠRNPuw5zDW.IFc9 m!ʼ]`al<~giCn91OnR$sA1:Rf׋B q-غ]Tά<%CNQ OOT3Ǭ?Mꦩ;6Gb6ig-apR]l~zt+F/&,ʹVN 1܆r cІFS4v<ݑfq>M+F=&&,9e:w8Rڼù)~K"b[*H k[~ p{ \OӡRÆiEg%Nb;uT^1U5X]G8{S ݇rSRݚs[WԸNwnG̻1܎ݺ\VF><䕻h'jPj.,y4tuPh获>opcuQKC׃r)]SzqHQFM ;0{uƒ2GkHBw5 dXrpbיZZa)*3XbD QGκVPˌ p|Z|h"WV[3s L$4FCpq{ FO4UtVS̼_o_\_&8*b?.3\`Ҍ}$! >1NeL ڸb8/fLQn^~I\E*<It|%oAnSg_yBۮ{~Sٟ/Ou]{_ZfE:[~x8gf~:e\O(Lו%KwuvgOX?.o%YEhrAɀ)꣒Իx e^;&v9<_ʝpQ9$U5-on8Zٰnwj{-_.KCDٽAaS+gJVgJQ!QKX&HRͩ:) 938PqvL.[SGEJ(a8E"p'x w~m>TQJ "i֚[!bt׸QFQ?,S`g5d3\11..>M7Z!+M(p_n{8DQUYC9+o"yC*3E a2zga#,b bQ .qdXjXY1Ld8)b©J y(mF^-/\wċ]u!CKU pД/U,ZECPU ZۤAh4'ov1N˨ q(d 2'{q ͅa+0yĤ5 S(7X@_|i+3P`NKYF0(3g>1iX9Rp@!ξfD-Mj~̻D#щR((|Bh 1R*^"Pě"&>48ZBb~q#.13$]ZD*ěqKڎ#kDusƚǙ!)'$~v }Jj$љ_E^(&c27^9 g_..(UҫO .%26sNgV|=c D(# UGЄpO!Mn5F=Tᵮ}«$'_``% HI2F* #\|T߰+@tDj4wN m:m0GedL;uXJ*P{˗+W^e'd z6ADm9IJHH mt*DDz/$5xDաKa{"!cJ(y4ڒ}2XW=F',ɝ.'D/))E:QdSw 8khkBoi]kpEtJ`%)C_0$ӊ ްlK1 ǡ| NAbEYɔ>aPv4Ǥ+w4꠿YfM#CKG;&&p䠖vIc&zK5MeMZ+C:;4ÿYkY@`iݗѥA-#K. 
Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[675468661]: [11.677164494s] [11.677164494s] END
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.287266 4791 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.290092 4791 trace.go:236] Trace[1113905256]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 00:05:47.279) (total time: 15010ms):
Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[1113905256]: ---"Objects listed" error: 15010ms (00:06:02.289)
Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[1113905256]: [15.010320493s] [15.010320493s] END
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.290176 4791 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 00:06:02 crc kubenswrapper[4791]: E0217 00:06:02.290940 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.291207 4791 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.298630 4791 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.323584 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57938->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.323682 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57938->192.168.126.11:17697: read: connection reset by peer"
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.368788 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.371696 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c" exitCode=255
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.371750 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c"}
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.408202 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.427220 4791 scope.go:117] "RemoveContainer" containerID="f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c"
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.490773 4791 csr.go:261] certificate signing request csr-lxl7v is approved, waiting to be issued
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.529938 4791 csr.go:257] certificate signing request csr-lxl7v is issued
Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.998039 4791 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 17 00:06:02 crc kubenswrapper[4791]: W0217 00:06:02.998205 4791 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 17 00:06:02 crc kubenswrapper[4791]: W0217 00:06:02.998233 4791 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 17 00:06:02 crc kubenswrapper[4791]: W0217 00:06:02.998243 4791 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.076618 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.090322 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.127967 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.136902 4791 apiserver.go:52] "Watching apiserver"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.141798 4791 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.142152 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-machine-config-operator/machine-config-daemon-9klkw","openshift-multus/multus-299s7","openshift-multus/multus-additional-cni-plugins-8stwf","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-dns/node-resolver-dl4gt","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c"]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.142532 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144522 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dl4gt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144579 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144609 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144907 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145083 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8stwf"
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.145229 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145239 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145254 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9klkw"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145282 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-299s7"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145370 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.145709 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.145798 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.148299 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.148843 4791 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.150291 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.152500 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.154141 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.160017 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 18:23:01.931267565 +0000 UTC
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.161966 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.162026 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.162044 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.166918 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167147 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167025 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167651 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167662 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167859 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.169993 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.170024 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.170069 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 00:06:03 crc
kubenswrapper[4791]: I0217 00:06:03.170988 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.174169 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.174203 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.176064 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.176266 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.177687 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.177920 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.177917 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.178477 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196682 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.196725 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196748 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196792 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196818 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196837 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196858 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196877 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196897 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196918 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196940 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196964 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196987 
4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197011 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197020 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197033 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197092 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197131 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197149 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197170 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197173 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197188 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197249 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197277 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197301 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197323 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197348 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197371 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197397 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197422 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197448 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197473 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197497 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197547 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197573 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197594 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197617 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197639 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197666 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197723 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197747 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197769 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197791 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197813 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197836 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197857 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197880 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197902 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197947 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197971 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197994 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198018 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198039 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.198060 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198081 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198107 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198148 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198170 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198240 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198263 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198287 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198310 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198331 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198352 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197421 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198377 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197580 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198385 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198401 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197631 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197676 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197812 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197870 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197902 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197928 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197994 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198146 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198165 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198234 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198279 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198341 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198349 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198393 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198512 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198541 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198571 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198594 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198644 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198668 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198691 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198714 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198738 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198761 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198783 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198805 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198836 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198858 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198878 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198899 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.198922 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198944 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198965 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198988 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199010 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199033 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199059 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199083 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199126 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199151 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199172 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " 
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199197 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199220 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199245 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199269 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199289 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199310 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199331 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199354 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199374 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199423 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199450 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.199472 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199494 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199516 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199536 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199579 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199601 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199623 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199645 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199666 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199686 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199707 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 
00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199730 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199751 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199773 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199797 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199818 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199840 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199863 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199887 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199908 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199929 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199950 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 
00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199971 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199990 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200010 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200028 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200050 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200070 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200089 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200255 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200282 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200305 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200326 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.200347 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200368 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200398 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200419 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200440 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200460 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200481 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200505 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200526 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200549 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200570 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.200592 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200616 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200639 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200664 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200686 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200707 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200749 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200774 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200815 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200838 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.200860 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200884 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200906 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200927 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200948 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200970 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200991 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201012 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201036 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201058 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201079 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201103 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201147 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201170 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201192 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201219 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201243 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201265 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201288 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201313 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201334 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201364 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201387 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201410 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201435 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201462 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201486 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201510 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: 
I0217 00:06:03.201534 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201559 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201583 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201607 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201631 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201655 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201681 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201705 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201788 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201819 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201842 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-kubelet\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201867 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-etc-kubernetes\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201889 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b819236-9682-4ef9-8653-516f45335793-hosts-file\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201919 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201944 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02a3a228-86d6-4d54-ad63-0d36c9d59af5-proxy-tls\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: 
I0217 00:06:03.201968 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02a3a228-86d6-4d54-ad63-0d36c9d59af5-mcd-auth-proxy-config\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201991 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-socket-dir-parent\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202014 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rswnq\" (UniqueName: \"kubernetes.io/projected/1104c109-74aa-4fc4-8a1b-914a0d5803a4-kube-api-access-rswnq\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202043 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202069 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-k8s-cni-cncf-io\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202136 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202166 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202189 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202212 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-netns\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202232 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-conf-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202285 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202310 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cni-binary-copy\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202333 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-multus-certs\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202355 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cf8\" (UniqueName: 
\"kubernetes.io/projected/1b819236-9682-4ef9-8653-516f45335793-kube-api-access-l4cf8\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202379 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-os-release\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202411 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202432 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nwqf\" (UniqueName: \"kubernetes.io/projected/eab5901c-ba92-4f20-9960-ac7cfd67b25a-kube-api-access-5nwqf\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202453 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-system-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202474 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-os-release\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202498 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-bin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202524 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-binary-copy\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202550 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-multus\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202577 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202600 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02a3a228-86d6-4d54-ad63-0d36c9d59af5-rootfs\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202625 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-daemon-config\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202650 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-system-cni-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202675 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202698 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cnibin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.202726 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202752 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202776 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cnibin\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202804 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202831 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" 
(UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202857 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202881 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rgm\" (UniqueName: \"kubernetes.io/projected/02a3a228-86d6-4d54-ad63-0d36c9d59af5-kube-api-access-c6rgm\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202903 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-hostroot\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202957 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202973 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202989 
4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203005 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203023 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203036 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203049 4791 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203062 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203075 4791 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203087 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203100 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203134 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203148 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203160 4791 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203174 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203188 4791 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203202 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node 
\"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203218 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203232 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198439 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198576 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198709 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198707 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198767 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198800 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198820 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198892 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198910 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199018 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199040 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199046 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199190 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199209 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199367 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199382 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199530 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199734 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200317 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200579 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201408 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201465 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201626 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201749 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201878 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201927 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202201 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202481 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202499 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202716 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202780 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202922 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203136 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203234 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203380 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203529 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203572 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203591 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203794 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203793 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203812 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203834 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203949 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204003 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204206 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204271 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204291 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204496 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204634 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204643 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205003 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205138 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205457 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205455 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205488 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205598 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205719 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205709 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205747 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205851 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205921 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206040 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206080 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206196 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206250 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206324 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206376 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206437 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206547 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206764 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206884 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.207093 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.207964 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.208632 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.208704 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.708687743 +0000 UTC m=+21.188200270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.209757 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.210875 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211108 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.211340 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211353 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211445 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211581 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211781 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212235 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212490 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212556 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212638 4791 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211628 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.214537 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.214902 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.215097 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.215134 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.215282 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.216395 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.216873 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.217937 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.218254 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.218334 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.218688 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.219550 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220030 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220305 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220379 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220494 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220755 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220831 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.221019 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.222629 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223009 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223294 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223605 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223672 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.224339 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.224670 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225028 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225190 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225453 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225520 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225890 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.226251 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.227249 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.228424 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.228731 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.228990 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.229833 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.232162 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.232409 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.232574 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.232746 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.233027 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.233409 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.233692 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234006 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234448 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234679 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234783 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235090 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235225 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235404 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235466 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235826 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.235915 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.735891594 +0000 UTC m=+21.215404121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236267 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236431 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236849 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236864 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.237653 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.237960 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.238132 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.238156 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238286 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238448 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238722 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238740 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.239028 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.240021 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.240280 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.240493 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.241437 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.241757 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.241822 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.741801681 +0000 UTC m=+21.221314208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.242162 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.742107111 +0000 UTC m=+21.221619638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.242298 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.242791 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.243025 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.250637 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.250706 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.250732 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.250749 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.251038 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.251088 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.751057894 +0000 UTC m=+21.230570411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.251222 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.251595 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.252028 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.259850 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.260079 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.260647 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.260780 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.261129 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262203 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262430 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262609 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262752 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.263024 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.263556 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.263622 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.264071 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.264110 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.264829 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.265099 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.265317 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.266832 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.267794 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.268394 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.269914 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.270892 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.272513 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.273920 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.274189 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.274515 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.275342 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.276512 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.277308 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.278268 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.279541 4791
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.280368 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.281960 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.283351 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.283899 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284411 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284474 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284502 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284949 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.285262 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.285938 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.286973 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.287087 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.287525 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.287994 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.294099 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.296261 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304392 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cnibin\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304455 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rgm\" (UniqueName: \"kubernetes.io/projected/02a3a228-86d6-4d54-ad63-0d36c9d59af5-kube-api-access-c6rgm\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304477 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-hostroot\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304510 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304531 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-kubelet\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304551 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-etc-kubernetes\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304570 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b819236-9682-4ef9-8653-516f45335793-hosts-file\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304602 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02a3a228-86d6-4d54-ad63-0d36c9d59af5-proxy-tls\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304622 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02a3a228-86d6-4d54-ad63-0d36c9d59af5-mcd-auth-proxy-config\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304643 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-socket-dir-parent\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304664 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rswnq\" (UniqueName: \"kubernetes.io/projected/1104c109-74aa-4fc4-8a1b-914a0d5803a4-kube-api-access-rswnq\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304698 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-k8s-cni-cncf-io\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304716 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304737 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304759 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-netns\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304777 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-conf-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304797 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cni-binary-copy\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304817 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-multus-certs\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304836 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4cf8\" (UniqueName: \"kubernetes.io/projected/1b819236-9682-4ef9-8653-516f45335793-kube-api-access-l4cf8\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304855 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-os-release\") pod \"multus-additional-cni-plugins-8stwf\" (UID: 
\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304887 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304917 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nwqf\" (UniqueName: \"kubernetes.io/projected/eab5901c-ba92-4f20-9960-ac7cfd67b25a-kube-api-access-5nwqf\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304937 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-system-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304956 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-os-release\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304979 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-bin\") pod \"multus-299s7\" (UID: 
\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305002 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-binary-copy\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305021 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-multus\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305052 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02a3a228-86d6-4d54-ad63-0d36c9d59af5-rootfs\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305072 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-daemon-config\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305095 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-system-cni-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " 
pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305130 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305152 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cnibin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305219 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305233 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305247 4791 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305260 4791 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305271 4791 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305283 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305295 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305309 4791 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305322 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305333 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305346 4791 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305358 4791 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305370 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305382 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305394 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305406 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305418 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305430 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305441 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305454 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305466 4791 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305478 4791 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305490 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305501 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305512 4791 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305523 4791 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: 
I0217 00:06:03.305536 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305549 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305560 4791 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305572 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305584 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305595 4791 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305606 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305617 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305629 4791 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305640 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305652 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305696 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305708 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305720 4791 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305731 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on 
node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305744 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305756 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305767 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305778 4791 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305790 4791 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305806 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305818 4791 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node 
\"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305832 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305844 4791 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305855 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305868 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305880 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305891 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305903 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305915 4791 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305927 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305938 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305950 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305961 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305975 4791 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305987 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305998 4791 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306010 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306021 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306031 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306043 4791 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306054 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306065 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306076 4791 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306089 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306100 4791 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306130 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306143 4791 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306154 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306166 4791 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306179 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306191 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306204 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306218 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306230 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306241 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306257 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306268 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath 
\"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306279 4791 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306290 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306302 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306313 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306324 4791 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306335 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306348 4791 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.306359 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306370 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306382 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306394 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306405 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306415 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306427 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306439 4791 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306450 4791 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306462 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306473 4791 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306485 4791 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306496 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306508 4791 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306519 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306530 4791 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306542 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306553 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306564 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306576 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306588 4791 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306600 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306611 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306622 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306633 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306645 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306656 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306667 4791 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306679 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306691 4791 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306702 4791 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306713 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306724 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306734 4791 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306745 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306756 4791 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306767 4791 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306778 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306789 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306800 4791 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306811 4791 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306823 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306834 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306848 4791 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node 
\"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306859 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306870 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306882 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306893 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306905 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306917 4791 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306928 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306940 4791 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306951 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306965 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306976 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306987 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306998 4791 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307010 4791 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307021 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307032 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307043 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307055 4791 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307066 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307078 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307089 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307100 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307126 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307138 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307149 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307160 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307173 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307184 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307195 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307207 4791 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307218 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307232 4791 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307243 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307254 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307265 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307276 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307287 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307299 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307365 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cnibin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307508 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307913 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-conf-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307963 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-netns\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308030 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cnibin\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308303 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-hostroot\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308592 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308617 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-kubelet\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308622 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cni-binary-copy\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" 
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308639 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-etc-kubernetes\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308668 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-multus-certs\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308670 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b819236-9682-4ef9-8653-516f45335793-hosts-file\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.309013 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-os-release\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.309904 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.310734 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02a3a228-86d6-4d54-ad63-0d36c9d59af5-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.310785 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-socket-dir-parent\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.313378 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-multus\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314134 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-system-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314733 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-os-release\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314741 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-binary-copy\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " 
pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314829 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314893 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314933 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-k8s-cni-cncf-io\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315038 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-system-cni-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315048 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315126 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02a3a228-86d6-4d54-ad63-0d36c9d59af5-rootfs\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315176 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-bin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.316334 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.316838 4791 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.317445 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.317622 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.322473 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323564 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323727 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-daemon-config\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323933 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323997 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.325135 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.325405 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02a3a228-86d6-4d54-ad63-0d36c9d59af5-proxy-tls\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.326384 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.334950 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.336001 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.338638 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rswnq\" (UniqueName: \"kubernetes.io/projected/1104c109-74aa-4fc4-8a1b-914a0d5803a4-kube-api-access-rswnq\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.339256 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.340294 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.342846 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nwqf\" (UniqueName: \"kubernetes.io/projected/eab5901c-ba92-4f20-9960-ac7cfd67b25a-kube-api-access-5nwqf\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.342979 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l4cf8\" (UniqueName: \"kubernetes.io/projected/1b819236-9682-4ef9-8653-516f45335793-kube-api-access-l4cf8\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.346665 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.347468 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.349013 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.349195 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rgm\" (UniqueName: \"kubernetes.io/projected/02a3a228-86d6-4d54-ad63-0d36c9d59af5-kube-api-access-c6rgm\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.349789 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.352447 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.354844 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.355650 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.361142 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.361789 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.363412 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.363993 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.365037 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.365798 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.366654 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.366979 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.367723 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.368321 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.376873 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.378993 4791 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef"} Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.378988 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.386865 4791 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.396367 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6
736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.405018 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.407971 4791 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.408001 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.408311 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410690 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410800 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410818 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410881 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.411699 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.411966 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.412321 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.412530 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.421921 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.432412 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.442559 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.451787 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.457929 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.459407 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.466709 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.472658 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.480323 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.487051 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.489383 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.496373 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.507739 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509076 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509315 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509393 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509488 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509635 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509668 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509694 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509720 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509742 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509792 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509835 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509874 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509893 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509951 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509987 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510014 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510029 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510044 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"ovnkube-node-hldzt\" (UID: 
\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510057 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.512490 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.522101 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.531436 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 00:01:02 +0000 UTC, rotation deadline is 2027-01-01 21:02:45.135846023 +0000 UTC Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.531499 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7652h56m41.604349062s for next certificate rotation Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.537967 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: W0217 00:06:03.549593 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465 WatchSource:0}: Error finding container 6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465: Status 404 returned error can't find the container with id 6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465 Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.564480 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.569890 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.581821 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.604904 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611385 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611531 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611560 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611478 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611641 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod 
\"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612029 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612057 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612099 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612213 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612240 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 
00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612321 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612354 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612325 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612379 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612375 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612402 4791 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612415 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612452 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612423 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612512 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612428 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612563 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612568 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612596 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612624 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612666 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod 
\"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612674 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612701 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612720 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612734 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612742 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612783 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612810 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.613735 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612633 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.613825 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.613879 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.616537 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.619447 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.637560 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.663877 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.666824 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.688477 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.700528 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.713765 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.713973 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.713955016 +0000 UTC m=+22.193467543 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.713957 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.720651 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.726941 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.737154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.749781 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.764197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.776412 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.788093 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.801446 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817007 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817066 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817125 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817160 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817318 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817338 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817351 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817405 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 00:06:04.81738325 +0000 UTC m=+22.296895777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817463 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817494 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.817486984 +0000 UTC m=+22.296999511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817547 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817571 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817596 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817607 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817585 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.817574047 +0000 UTC m=+22.297086574 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817866 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.817851415 +0000 UTC m=+22.297363942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.822862 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.844720 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.853965 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.863060 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.160659 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 16:25:44.807070515 +0000 UTC
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.382795 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" exitCode=0
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.382896 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.382953 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"8359c5871ee1aee2d63af5dec0cce97a0b6622d7bd312c2093b490d8e6067659"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.384870 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c" exitCode=0
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.384950 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.384996 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerStarted","Data":"15db0927244844c736c031a7899f4eb3cbe334b39369dcf8dcdbcca675203ee9"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.386441 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.386491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c65323782cfc3a851ec1f29e3d8a508c8f6cb90f787b2f4a3959638d4e1d03e3"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.388491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.388544 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"0504e7daa6d550e6b0ea30bcfb0365273e8a7d32d024bc1f1a472355f9e18036"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.389555 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.389583 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.389596 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.390814 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.390858 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.390878 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"55a7a6fbeb41808509bc1dbb654a2f37dc480cdd37aa343bf72f411031a80257"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.391523 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"540ca4fc76207bc22c76b263b8d30c65345cbebf01b8f18aa8f525088ed777ae"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.393687 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl4gt" event={"ID":"1b819236-9682-4ef9-8653-516f45335793","Type":"ContainerStarted","Data":"68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.393718 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl4gt" event={"ID":"1b819236-9682-4ef9-8653-516f45335793","Type":"ContainerStarted","Data":"15c0c51a7ca8fe26a9cfd09443f9ae3f4990df41e295376a8c1a10384ed8d9c6"}
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.394098 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.409647 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.419315 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.430616 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.440877 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.448909 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.461797 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status:
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.471550 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.479905 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.495758 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.514220 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.532695 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.545686 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.560450 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.573280 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.585631 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.599809 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.613730 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.628255 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.646194 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.664674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.674344 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.691732 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.705249 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.718074 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.725220 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.725412 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.725386644 +0000 UTC m=+24.204899211 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.734440 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.749208 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826644 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826692 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826717 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826746 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826847 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826873 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826887 4791 projected.go:194] Error preparing data for 
projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826892 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826888 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826927 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826910 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826991 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826946 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.826929418 +0000 UTC m=+24.306441955 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.827029 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.827007771 +0000 UTC m=+24.306520398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.827048 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.827038032 +0000 UTC m=+24.306550669 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.827073 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.827064953 +0000 UTC m=+24.306577620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.161132 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 13:08:04.306743855 +0000 UTC Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.209148 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-k5kxc"] Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.209457 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.211548 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.211585 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.212276 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219614 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219650 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219662 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219725 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:05 crc kubenswrapper[4791]: E0217 00:06:05.219798 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:05 crc kubenswrapper[4791]: E0217 00:06:05.219964 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:05 crc kubenswrapper[4791]: E0217 00:06:05.220441 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.225972 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.228373 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.230911 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.232382 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.233851 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.240966 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.255536 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.269411 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.283209 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.296607 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.309683 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.330586 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f0a7811-6a89-456b-95ea-6c8e698479dd-serviceca\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.330635 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnb2k\" (UniqueName: \"kubernetes.io/projected/5f0a7811-6a89-456b-95ea-6c8e698479dd-kube-api-access-mnb2k\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.330716 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0a7811-6a89-456b-95ea-6c8e698479dd-host\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.332791 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.346557 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.375162 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.386222 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.400766 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401155 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401173 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401186 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401200 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.402832 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737" exitCode=0 Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.402918 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.409617 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.431826 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0a7811-6a89-456b-95ea-6c8e698479dd-host\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.431906 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f0a7811-6a89-456b-95ea-6c8e698479dd-serviceca\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.431930 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnb2k\" (UniqueName: \"kubernetes.io/projected/5f0a7811-6a89-456b-95ea-6c8e698479dd-kube-api-access-mnb2k\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.432233 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0a7811-6a89-456b-95ea-6c8e698479dd-host\") pod \"node-ca-k5kxc\" (UID: 
\"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.433177 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f0a7811-6a89-456b-95ea-6c8e698479dd-serviceca\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.455567 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.477860 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnb2k\" (UniqueName: \"kubernetes.io/projected/5f0a7811-6a89-456b-95ea-6c8e698479dd-kube-api-access-mnb2k\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.479052 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.512052 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.526498 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.551179 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.566984 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.580750 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.593704 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.609513 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.621939 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.635205 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.671848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.706240 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.723283 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: W0217 00:06:05.741723 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f0a7811_6a89_456b_95ea_6c8e698479dd.slice/crio-5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5 WatchSource:0}: Error finding container 5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5: Status 404 returned error can't find the container with id 5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5 Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.755267 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.787235 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.830243 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.866271 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.162264 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:08:46.098403841 +0000 UTC Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.413688 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.416042 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.419332 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee" exitCode=0 Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.419420 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.421272 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k5kxc" event={"ID":"5f0a7811-6a89-456b-95ea-6c8e698479dd","Type":"ContainerStarted","Data":"f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.421312 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k5kxc" event={"ID":"5f0a7811-6a89-456b-95ea-6c8e698479dd","Type":"ContainerStarted","Data":"5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.442996 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.462997 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.491269 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.511537 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.526032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.539316 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.552652 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.563263 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.587485 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.597631 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.609268 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.620479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.639096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.648931 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.662249 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.674458 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.688948 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.698483 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.710857 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.724392 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.735191 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.742965 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.743234 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.743192787 +0000 UTC m=+28.222705344 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.751450 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.755168 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.759841 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.784722 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.808097 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.844676 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.844891 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845140 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845098463 +0000 UTC m=+28.324610990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845149 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845169 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845182 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845225 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845211167 +0000 UTC m=+28.324723694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.845035 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.845255 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.845276 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845347 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845356 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845364 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845383 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845377372 +0000 UTC m=+28.324889889 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845427 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845456 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845451145 +0000 UTC m=+28.324963672 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.850291 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.901008 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.924895 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.973753 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.012414 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.019515 4791 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.072806 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.104345 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.147257 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.162840 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-12-14 22:40:58.886449551 +0000 UTC Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.189879 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543
a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.219698 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.219749 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:07 crc kubenswrapper[4791]: E0217 00:06:07.220149 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.219796 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:07 crc kubenswrapper[4791]: E0217 00:06:07.220006 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:07 crc kubenswrapper[4791]: E0217 00:06:07.220412 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.230079 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.265611 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.308408 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.346691 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.387305 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.427359 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08" exitCode=0 Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.427659 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08"} Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.432529 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.468724 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.509332 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.550833 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.589156 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.629332 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.671615 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.706813 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.749860 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.790601 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.826089 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.869024 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.908561 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.949470 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.991705 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.034479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.070834 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.122329 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.148696 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.163431 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 00:40:40.45421499 +0000 UTC Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.203449 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.228912 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.435235 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90" exitCode=0 Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.435333 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.441878 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.465998 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.483613 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.514787 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.531534 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.550427 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.576383 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.589622 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.600170 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.611613 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.625462 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.671990 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.691279 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693622 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.704381 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.759881 4791 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.760186 4791 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761275 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761286 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.771863 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775276 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775292 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775304 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.787616 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790844 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.791933 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z 
is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.810874 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814127 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814142 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.826702 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.829515 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830291 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc 
kubenswrapper[4791]: I0217 00:06:08.830300 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830326 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.842924 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.843026 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844580 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844610 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844622 4791 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.882922 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946831 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946898 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.049981 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050042 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050070 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.164379 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 16:55:15.747774279 +0000 UTC Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.219463 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.219473 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:09 crc kubenswrapper[4791]: E0217 00:06:09.219610 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.219483 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:09 crc kubenswrapper[4791]: E0217 00:06:09.219667 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:09 crc kubenswrapper[4791]: E0217 00:06:09.219783 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256324 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256504 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359786 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359798 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.455097 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6" exitCode=0 Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.455352 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464728 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.482482 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.508934 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.540168 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.560763 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567915 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567933 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567984 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.582652 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.601032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.630591 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.645967 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.667616 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc 
kubenswrapper[4791]: I0217 00:06:09.670925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670972 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670991 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.688197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.716631 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.735971 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.758577 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773522 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773545 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773593 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.778496 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.803779 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bf
bd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876256 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876271 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876313 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979158 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979182 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979199 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082186 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082254 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082296 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.165405 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 03:26:00.753948679 +0000 UTC
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185939 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.289956 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290022 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290064 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.392973 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393078 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393098 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.468446 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.469061 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.475450 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerStarted","Data":"dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.493530 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.514527 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.516414 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.534442 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.548878 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.563372 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.581235 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.598401 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599826 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599898 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599910 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.612236 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.622756 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.643247 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70
f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.655960 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374c
d73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.691848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703102 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703166 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703231 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.706284 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.725158 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.742675 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc 
kubenswrapper[4791]: I0217 00:06:10.763674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.784324 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.786930 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.787148 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.787098147 +0000 UTC m=+36.266610714 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.804786 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1
361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-1
7T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813245 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813338 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813350 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.833901 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.857369 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.877750 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888808 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888913 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888955 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888990 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 
00:06:10.889066 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889101 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889149 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889181 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889221 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.889199399 +0000 UTC m=+36.368711956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889232 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889263 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.88923525 +0000 UTC m=+36.368747817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889277 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889341 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc 
kubenswrapper[4791]: E0217 00:06:10.889367 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889433 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.889403165 +0000 UTC m=+36.368915732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889520 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.889475588 +0000 UTC m=+36.368988185 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.899032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915879 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.916001 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.919500 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.934814 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.967977 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.980489 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.996756 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.015035 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018853 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018903 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018922 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018947 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018964 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.042694 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.057995 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 
00:06:11.121983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.166569 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 14:49:48.121724201 +0000 UTC Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.219433 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.219586 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:11 crc kubenswrapper[4791]: E0217 00:06:11.219635 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.219712 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:11 crc kubenswrapper[4791]: E0217 00:06:11.219886 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:11 crc kubenswrapper[4791]: E0217 00:06:11.220029 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224551 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224608 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365384 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365427 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365438 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365455 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365468 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468207 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468254 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.479287 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.479322 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.516351 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.534476 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.546725 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.564899 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570602 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570628 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.586839 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.603969 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.620137 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.652778 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.669252 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674382 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674439 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc 
kubenswrapper[4791]: I0217 00:06:11.674482 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.699103 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.717279 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.750046 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.766590 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.776993 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.777043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.777060 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.777083 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 
00:06:11.777101 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.788157 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.797878 4791 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.806327 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.828646 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880082 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880169 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880213 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880230 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.983931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.983978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.983996 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.984019 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.984036 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.041452 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.056439 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.074465 4791 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/c
rcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b
43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104584 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104626 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104645 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104668 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.126563 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.154937 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.166759 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.166986 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 11:20:56.492612949 +0000 UTC Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.179527 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.189539 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.206891 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.206979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207193 4791 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207204 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.216608 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.229544 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.240852 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.258376 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.272784 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-1
7T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.285340 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.304928 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309423 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514409 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514467 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.617537 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.617925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.618087 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.618267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.618391 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721158 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721207 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721229 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823240 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823270 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925693 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.939885 4791 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030204 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030234 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133155 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133202 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133251 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.167941 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 02:47:14.519648344 +0000 UTC Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.219312 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.219355 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:13 crc kubenswrapper[4791]: E0217 00:06:13.219453 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.219502 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:13 crc kubenswrapper[4791]: E0217 00:06:13.219691 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:13 crc kubenswrapper[4791]: E0217 00:06:13.219866 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235453 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235496 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235525 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.245081 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.258266 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.277950 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.292583 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.305552 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.319372 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338234 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.343230 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.359612 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.392746 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.408554 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.430025 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc 
kubenswrapper[4791]: I0217 00:06:13.441421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441439 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441455 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441466 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.444896 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.462669 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.480856 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.486395 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/0.log" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.489896 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa" exitCode=1 Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.489949 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.490533 4791 scope.go:117] "RemoveContainer" containerID="d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.504741 4791 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863
dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.522681 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.536580 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548313 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548355 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548387 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548398 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.554878 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.571745 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.587191 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.607281 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.625899 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.647027 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.651946 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.651989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.652006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.652032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.652049 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.666356 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.697300 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:13Z\\\",\\\"message\\\":\\\"olicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:13.113082 6064 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 00:06:13.113171 6064 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0217 00:06:13.113182 6064 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 00:06:13.113204 6064 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:13.113225 6064 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:13.113237 6064 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 00:06:13.113241 6064 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 00:06:13.113253 6064 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 00:06:13.113256 6064 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:13.113261 6064 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:13.113258 6064 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 00:06:13.113268 6064 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:13.113284 6064 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:13.113322 6064 factory.go:656] Stopping watch factory\\\\nI0217 00:06:13.113339 6064 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.720422 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.752492 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754580 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754630 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754654 4791 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754672 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.772617 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.789179 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.811580 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857412 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960040 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960075 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062890 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166586 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166640 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166653 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166679 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.168985 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 11:11:46.687364073 +0000 UTC Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269501 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269567 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.372956 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373088 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373128 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475875 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475917 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.496429 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.496978 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/0.log" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.500479 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" exitCode=1 Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.500515 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.500558 4791 scope.go:117] "RemoveContainer" containerID="d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.502542 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:14 crc kubenswrapper[4791]: E0217 00:06:14.502811 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.526384 4791 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5
a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.543685 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.563583 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.578905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.578966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.578984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.579010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.579030 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.579756 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.596871 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.612979 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.648286 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.664983 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682305 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682331 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc 
kubenswrapper[4791]: I0217 00:06:14.682390 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.691269 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.713316 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.743806 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:13Z\\\",\\\"message\\\":\\\"olicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:13.113082 6064 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 00:06:13.113171 6064 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 00:06:13.113182 6064 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI0217 00:06:13.113204 6064 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:13.113225 6064 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:13.113237 6064 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 00:06:13.113241 6064 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 00:06:13.113253 6064 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 00:06:13.113256 6064 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:13.113261 6064 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:13.113258 6064 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 00:06:13.113268 6064 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:13.113284 6064 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:13.113322 6064 factory.go:656] Stopping watch factory\\\\nI0217 00:06:13.113339 6064 ovnkube.go:599] Stopped ovnkube\\\\nI0217 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"}
,{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.761332 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.782523 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.784898 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.784960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.784979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.785008 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.785027 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.802417 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.825011 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887650 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991371 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991531 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095155 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095239 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095284 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.169815 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 14:47:19.500728797 +0000 UTC Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198624 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198709 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198755 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.220036 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.220257 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.220283 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.220354 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.220532 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.220759 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301876 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404761 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404786 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404803 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.506745 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507216 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507294 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507338 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.516026 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.516357 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.539687 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID
\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.558493 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.577757 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.599095 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610537 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.619150 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd9
06b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.641473 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.664149 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.688464 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.702888 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715577 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715677 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715733 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.735529 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.751296 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.773936 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.787418 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.806588 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818390 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818408 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.826910 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921241 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921272 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025794 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.127343 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq"] Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128671 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128740 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.129013 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.132257 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.132474 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.149757 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.165385 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.170769 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 22:40:45.345982918 +0000 UTC Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.186594 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.205878 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.224070 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231801 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231914 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231934 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.243074 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.255940 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.256144 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1832e521-1715-432d-917c-bc0ab725e92f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc 
kubenswrapper[4791]: I0217 00:06:16.256222 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h57s\" (UniqueName: \"kubernetes.io/projected/1832e521-1715-432d-917c-bc0ab725e92f-kube-api-access-9h57s\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.256284 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.260193 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.273816 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.292942 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.310490 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.327181 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335047 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc 
kubenswrapper[4791]: I0217 00:06:16.335123 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335138 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335169 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.346628 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.356981 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357102 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357261 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1832e521-1715-432d-917c-bc0ab725e92f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h57s\" (UniqueName: \"kubernetes.io/projected/1832e521-1715-432d-917c-bc0ab725e92f-kube-api-access-9h57s\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357909 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357978 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.364428 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1832e521-1715-432d-917c-bc0ab725e92f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: 
\"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.421179 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h57s\" (UniqueName: \"kubernetes.io/projected/1832e521-1715-432d-917c-bc0ab725e92f-kube-api-access-9h57s\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.421684 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.432954 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438563 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.442634 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: W0217 00:06:16.456997 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1832e521_1715_432d_917c_bc0ab725e92f.slice/crio-6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1 WatchSource:0}: Error finding container 6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1: Status 404 returned error can't find the container with id 6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1 Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.457328 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name
\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.468620 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.522202 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" event={"ID":"1832e521-1715-432d-917c-bc0ab725e92f","Type":"ContainerStarted","Data":"6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540720 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540802 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643806 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643821 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643831 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.747483 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.747656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.747738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.748599 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.748649 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852491 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852676 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852696 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955913 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955930 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.057945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058042 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058060 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161265 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.171434 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 08:30:23.286603384 +0000 UTC Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.220166 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.220190 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.220177 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.220349 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.220689 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.220839 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263889 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366855 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366918 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470501 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470544 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.530383 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" event={"ID":"1832e521-1715-432d-917c-bc0ab725e92f","Type":"ContainerStarted","Data":"e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.530447 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" event={"ID":"1832e521-1715-432d-917c-bc0ab725e92f","Type":"ContainerStarted","Data":"6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.554740 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false
,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573251 4791 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.574736 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.596587 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.617977 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.618997 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6x28n"] Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.619698 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.619799 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.638197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675771 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675893 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.686680 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.706168 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.727299 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.751674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.763339 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.771816 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzq7\" (UniqueName: \"kubernetes.io/projected/1d97cf45-2324-494c-839f-6f264eba3828-kube-api-access-tnzq7\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.771844 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod 
\"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.776067 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777652 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.794781 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.816926 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.828974 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.844124 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.858932 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.872894 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzq7\" (UniqueName: \"kubernetes.io/projected/1d97cf45-2324-494c-839f-6f264eba3828-kube-api-access-tnzq7\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.872962 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.873140 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.873215 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.373196368 +0000 UTC m=+35.852708905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.878072 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879733 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879848 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879873 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.892473 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.899894 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzq7\" (UniqueName: \"kubernetes.io/projected/1d97cf45-2324-494c-839f-6f264eba3828-kube-api-access-tnzq7\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.923226 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.942440 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.956364 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.970141 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983409 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983489 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983571 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.984340 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.996423 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.013310 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.027496 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.042674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.056351 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.075205 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086941 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086965 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.087015 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.106605 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.119178 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.136562 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc 
kubenswrapper[4791]: I0217 00:06:18.168232 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.172444 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:30:26.139074475 +0000 UTC Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189717 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189758 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293302 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293344 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.380488 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.380774 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.380900 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:19.380868847 +0000 UTC m=+36.860381404 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.396988 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397054 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397094 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397138 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500654 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500708 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.603950 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604005 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604068 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707241 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707305 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707362 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707383 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809823 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.884872 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.885149 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:06:34.885067368 +0000 UTC m=+52.364579935 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913204 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913221 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986275 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986380 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986419 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986456 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986540 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986577 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986589 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986594 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986604 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986636 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986614 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986715 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:18 crc 
kubenswrapper[4791]: E0217 00:06:18.986644 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986627543 +0000 UTC m=+52.466140070 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986785 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986761988 +0000 UTC m=+52.466274555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986807 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986795589 +0000 UTC m=+52.466308146 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986826 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986816429 +0000 UTC m=+52.466328996 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.016015 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118756 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118827 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118881 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118904 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123555 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.145339 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150688 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150837 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.172075 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.173188 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 10:24:24.324309054 +0000 UTC Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.177983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178093 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178184 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.207350 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212973 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220560 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220634 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220664 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.220761 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220793 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.220919 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.221046 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.221182 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.239062 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245887 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.263798 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.263950 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265844 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265912 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265935 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265952 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369564 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369587 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369642 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.391053 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.391237 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.391332 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:21.391305293 +0000 UTC m=+38.870817850 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.472895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.472969 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.472989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.473017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.473036 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576775 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680501 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680525 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784580 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784682 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784700 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887579 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990957 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990996 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094739 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094884 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.173596 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 19:26:47.645229442 +0000 UTC
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198079 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198203 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198221 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300733 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300803 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403689 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403734 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.506918 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.506976 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.507000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.507024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.507045 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609853 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609947 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713133 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713364 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713411 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815461 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815504 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918674 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918802 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918819 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022235 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124814 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124826 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124834 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.174074 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 09:00:07.473383185 +0000 UTC
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219780 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219873 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219908 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219991 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.219985 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.220158 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.220460 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.220585 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227494 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227536 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.411613 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.411776 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.411845 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:25.411822393 +0000 UTC m=+42.891334930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432790 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432826 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535243 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638251 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638263 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638291 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741448 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741506 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741574 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844373 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844404 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844427 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.946883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.946953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.946978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.947006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.947026 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049744 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049774 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153272 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153387 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153405 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.175150 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 17:14:42.131943143 +0000 UTC
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256766 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256787 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359473 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359512 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359529 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461914 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564925 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666645 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666688 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769840 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769907 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874028 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874143 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874220 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977169 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977228 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977270 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080652 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080669 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.175537 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:08:04.891064426 +0000 UTC Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183748 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183771 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219806 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219891 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219942 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219954 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220188 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220477 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220627 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220825 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.243584 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.265141 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286359 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286419 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 
00:06:23.286459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286475 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.291056 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a818
1dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.313704 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.339866 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.358096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.374597 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389141 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389223 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.392784 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.411277 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.427197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.467508 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.485022 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc 
kubenswrapper[4791]: I0217 00:06:23.492782 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.508642 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.528181 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.561958 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.580361 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596047 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596135 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596154 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596202 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.598023 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc 
kubenswrapper[4791]: I0217 00:06:23.699162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699850 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.802965 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803095 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906834 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906888 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010307 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010447 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.176509 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 22:44:06.38731224 +0000 UTC Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216546 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216631 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216672 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322766 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322847 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322894 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322913 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.426965 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427065 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.530002 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.632978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633080 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633135 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.735951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736011 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736077 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839255 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839343 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839365 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839398 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942350 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044918 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148548 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148571 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148614 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.177275 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 14:42:04.911961008 +0000 UTC Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220372 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220449 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.220594 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220680 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220718 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.220891 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.220986 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.221090 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252101 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252249 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355944 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.453069 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.453330 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.453452 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:33.453421971 +0000 UTC m=+50.932934528 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459791 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562760 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562813 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562836 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665728 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665740 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665767 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769094 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769171 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769192 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872621 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872794 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976394 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976416 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976462 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079834 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.177571 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 22:13:08.525395331 +0000 UTC
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183383 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183446 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183500 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183522 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286150 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286235 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492906 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492923 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.595894 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.595985 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.596010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.596043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.596068 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699767 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802970 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.905962 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906022 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906062 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010239 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010255 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010296 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113443 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.178214 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 20:35:28.399948158 +0000 UTC
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216008 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216060 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219521 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219555 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219647 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.219751 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219837 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.219941 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.220066 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.220186 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318969 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318993 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422519 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422606 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422622 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525977 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628332 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731453 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731519 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834890 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834932 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834948 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937709 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937784 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041267 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144300 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144319 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.179185 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 01:08:52.722765387 +0000 UTC
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247373 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247458 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247505 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.350945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351050 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351069 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454576 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558230 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558331 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661373 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763840 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763919 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763929 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867325 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867336 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.970942 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971070 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.073920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074214 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074235 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177952 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177997 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.178015 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.180283 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 19:05:19.508331486 +0000 UTC Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219598 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.219773 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219872 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219884 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219926 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.220618 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.220697 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.220811 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.220997 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.283874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284269 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284286 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.386974 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387142 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387169 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477363 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477430 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477470 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.498902 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504816 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504862 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.518695 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523256 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523300 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523330 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523344 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.541338 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546466 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546504 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546544 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.564428 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569544 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569582 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.574927 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.577980 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.578407 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.589723 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.590098 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592673 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592780 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592798 4791 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.598203 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.621528 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.650507 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.671618 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696491 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc 
kubenswrapper[4791]: I0217 00:06:29.696542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.698905 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.718024 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.734240 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.756373 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.774200 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800256 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.801587 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.828711 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.856729 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.870906 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.886392 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902311 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902320 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902333 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902342 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.911514 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.920341 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.933726 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.004957 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.004991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.005002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.005017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.005027 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107067 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107097 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.180853 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:37:05.289376691 +0000 UTC Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312493 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312504 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312532 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.414898 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.414961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.414979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.415004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.415022 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518795 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.585839 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.586877 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.591641 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" exitCode=1 Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.591709 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.591768 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.592907 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:30 crc kubenswrapper[4791]: E0217 00:06:30.593196 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.607784 4791 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.621928 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc 
kubenswrapper[4791]: I0217 00:06:30.623480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623557 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623649 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.648656 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.661596 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.704279 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726731 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.734099 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.755579 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 
00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni
/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.771482 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.788744 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.806964 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.824479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: 
I0217 00:06:30.829612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829751 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829778 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.841549 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.864173 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T
00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69
183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.885865 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.904805 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.924306 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.932583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933744 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933926 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.943818 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037694 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037848 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140316 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140329 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.181205 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 18:38:58.719247442 +0000 UTC Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219388 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219517 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219625 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.219619 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.219761 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.219941 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.220176 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347644 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347657 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450760 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450846 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554546 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554573 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554590 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.597249 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.600190 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.600346 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.642681 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656514 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656540 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656552 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.658095 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77
783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.668203 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.678000 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.688706 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.697358 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.705936 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.717140 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.727542 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.738699 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.754189 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.758989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759023 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759034 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759062 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.776178 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.786288 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.802206 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc 
kubenswrapper[4791]: I0217 00:06:31.834000 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.848503 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.860404 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861829 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964697 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067852 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171767 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.182432 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 13:23:34.422248985 +0000 UTC Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275202 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275257 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377559 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377606 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377625 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480186 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583931 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686901 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686921 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686967 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789287 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789372 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789398 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789416 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893003 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893181 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000466 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000682 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.095396 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.103734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.103841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.103992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.104023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.104046 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.108255 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.132050 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e958
0356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.148212 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.170475 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.183191 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:08:28.943175902 +0000 UTC Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.190680 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 
17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207645 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220427 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220527 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.220660 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220698 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220806 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.220947 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.221055 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.221198 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.224094 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.240069 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.255759 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc 
kubenswrapper[4791]: I0217 00:06:33.275601 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.294286 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310869 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310888 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.318848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.344959 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.365832 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.384166 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.403272 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414330 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.422703 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.439215 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.455737 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.472275 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.493300 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.513598 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516942 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.532602 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.540174 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.540339 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.540435 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:49.540408103 +0000 UTC m=+67.019920660 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.550991 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.574530 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.591783 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.608531 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc 
kubenswrapper[4791]: I0217 00:06:33.619917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.619977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.619999 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.620024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.620043 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.638868 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.654314 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.674154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.693061 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723362 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.725154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.742641 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.765185 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.788394 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.808582 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826750 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826777 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.833773 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929674 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929736 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.032895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033077 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033100 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136766 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136825 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136843 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136883 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.184374 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:48:00.836081502 +0000 UTC
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240329 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240446 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343418 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343522 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446274 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446311 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446340 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446354 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.548877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.548944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.548971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.549001 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.549037 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652064 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652209 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755913 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755955 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858499 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858565 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858605 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.956954 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:06:34 crc kubenswrapper[4791]: E0217 00:06:34.957198 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2026-02-17 00:07:06.957158339 +0000 UTC m=+84.436670906 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961101 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961147 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058244 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058319 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058366 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058440 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058454 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058496 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058493 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058516 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058589 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058560979 +0000 UTC m=+84.538073536 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058615 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058603481 +0000 UTC m=+84.538116048 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058680 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058699 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058825 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058796677 +0000 UTC m=+84.538309244 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058717 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058872 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058942 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058926981 +0000 UTC m=+84.538439548 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064608 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064672 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064721 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167600 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167643 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167664 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.185049 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 02:14:40.315048597 +0000 UTC Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.219805 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.219842 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.219911 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220079 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.220140 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220263 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220366 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220657 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.274535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.274984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.275318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.275536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.275760 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379830 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379901 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379922 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379974 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483740 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483837 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483879 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.586667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.586946 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.587073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.587236 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.587390 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794531 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794550 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898751 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898911 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002397 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002437 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106573 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106631 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106649 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106672 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106695 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.185638 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 11:57:31.235483287 +0000 UTC Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210522 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210565 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314852 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314871 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314896 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314913 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417102 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417178 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417222 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.519940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520096 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622413 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622434 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622458 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622476 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.725890 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.725959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.725976 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.726002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.726015 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828197 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828209 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931262 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035947 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139550 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139702 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.186286 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 13:01:18.783157753 +0000 UTC Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220205 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220205 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220449 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220502 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220257 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220578 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220706 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220852 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242733 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242750 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346327 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449313 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.552856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.552950 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.552977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.553012 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.553037 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.656000 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.761981 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762145 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762184 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762196 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864822 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864933 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967695 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.069937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.069994 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.070009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.070033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.070050 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173588 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.187322 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:11:42.03298054 +0000 UTC Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276801 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276817 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380565 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380623 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483541 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483595 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.585782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.585968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.586027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.586053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.586070 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689471 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793611 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896194 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896323 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998714 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998780 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998817 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.100938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101041 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101053 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.188176 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:53:39.640429289 +0000 UTC Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204064 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204286 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204309 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219625 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219647 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219749 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219792 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220019 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220851 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220994 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220565 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306998 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410232 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410404 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410435 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513912 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.514002 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.617816 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.617928 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.617979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.618004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.618048 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721539 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721565 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721585 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.824932 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825014 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825037 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825091 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.853987 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854093 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854155 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.871340 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.876984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877052 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877093 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.906043 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.912710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.912765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.912784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.912808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.912826 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.936982 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.942348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.942413 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.942426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.942452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.942468 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.965624 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970184 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970262 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970287 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970308 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.991920 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.992167 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994536 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097863 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.188967 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 01:02:42.630809025 +0000 UTC Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200744 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200767 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200785 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304229 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407188 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407316 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407339 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.510918 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.510986 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.511010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.511046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.511067 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.614972 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615036 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615078 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615095 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719250 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719307 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822847 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926601 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030322 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030342 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030399 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134158 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134251 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.189905 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:30:41.186683997 +0000 UTC Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219523 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219582 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.219728 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219764 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219864 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.220012 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.220182 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.220350 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236666 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236683 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236729 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340194 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443437 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546062 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546079 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546143 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546162 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684700 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684716 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787705 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894324 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894366 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.997914 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.997982 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.997998 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.998023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.998040 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100915 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100954 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100980 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.190479 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 17:33:39.799139837 +0000 UTC
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204252 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204302 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204358 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.306908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.306981 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.307007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.307037 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.307062 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513764 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617714 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617748 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617773 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720922 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823757 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823774 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823817 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926325 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926395 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926463 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029320 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029361 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029378 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132530 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132548 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.190915 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:06:26.454486286 +0000 UTC
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219461 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219512 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219480 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219624 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.219817 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.222702 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.224281 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.224370 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.225511 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1"
Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.225848 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235901 4791 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.242174 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.265434 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.284192 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.308724 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.330449 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339892 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339995 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.354313 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0
84652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"na
me\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.368895 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.385391 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.406134 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.420191 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.434282 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442793 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442868 4791 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442900 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442922 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.467475 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994
bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.483618 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.504574 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.523763 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546003 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546050 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546066 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546193 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.555096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.570919 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.586843 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc 
kubenswrapper[4791]: I0217 00:06:43.649073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649143 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649207 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752764 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752922 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856358 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856474 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959280 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959308 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959328 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062186 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062240 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062260 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165330 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165358 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165380 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.192011 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:21:50.657359513 +0000 UTC Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268018 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268196 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370723 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370822 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370852 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370874 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474247 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474275 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474324 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577531 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577647 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680668 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680711 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680729 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.783895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.783961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.783979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.784004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.784024 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887232 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887271 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989925 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093539 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093555 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.192907 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 14:38:29.412049599 +0000 UTC Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196647 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220141 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220221 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220247 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220403 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220493 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220680 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220774 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220883 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300401 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505611 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505635 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505652 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608537 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710855 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710952 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710982 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.711000 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813462 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813556 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.915935 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916050 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916088 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916143 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018590 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120492 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120508 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120539 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.193229 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 06:48:58.722898515 +0000 UTC Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.229451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.229554 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.229577 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.230072 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.230354 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.333978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.334029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.334046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.334067 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.335202 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438394 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438419 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438477 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.541971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542028 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542070 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542087 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644726 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644786 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.747995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748250 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850883 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953806 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953869 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953913 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953930 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056551 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160582 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160679 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.161280 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.193512 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:06:18.836079387 +0000 UTC Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.219994 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.220033 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.220058 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220206 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220333 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.220382 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220578 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220671 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264644 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264684 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264700 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368369 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470698 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470723 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470732 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574600 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574624 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574685 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677521 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677545 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780186 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780228 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882537 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882566 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984631 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984643 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984671 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087252 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087266 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087275 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188770 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188797 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.194092 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 03:00:06.599822894 +0000 UTC Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292270 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499529 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499546 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602164 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602271 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602294 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602311 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704444 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806714 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806732 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806745 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908869 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908963 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908980 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011920 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113928 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113950 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.195151 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 11:59:12.358390211 +0000 UTC Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215933 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215963 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220309 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220339 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220427 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220387 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220387 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220563 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220618 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220726 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318580 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420821 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420863 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522678 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.612983 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.613154 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.613203 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:21.613190969 +0000 UTC m=+99.092703496 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624659 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726688 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726732 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829844 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829878 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829889 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932063 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932118 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932128 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932154 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035118 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137887 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.196221 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:30:55.297314149 +0000 UTC Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239606 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239615 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.254955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.254989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.255000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.255017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.255029 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.270580 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274678 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.286155 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289578 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289602 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.304211 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307514 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307527 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.320219 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323647 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323688 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323728 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.336599 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.336703 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342087 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342190 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444790 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444848 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444856 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547477 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547494 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547506 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649727 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649739 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.739819 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/0.log" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.739862 4791 generic.go:334] "Generic (PLEG): container finished" podID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" containerID="de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7" exitCode=1 Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.739887 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerDied","Data":"de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.740234 4791 scope.go:117] "RemoveContainer" containerID="de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751750 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.759969 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 
00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.779452 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.792178 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.804008 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.814137 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.822646 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.831842 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.851314 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854081 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.862042 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.872759 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.903167 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.937766 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.948674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955648 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955668 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.960183 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc 
kubenswrapper[4791]: I0217 00:06:50.970170 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.983278 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.000381 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.018915 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058544 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058600 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161728 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.196493 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:55:10.920588291 +0000 UTC Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.219914 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.219991 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.220057 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.220084 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220078 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220220 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220262 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220306 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263847 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263930 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.264008 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365753 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365820 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467604 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569883 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672030 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672091 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672102 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.744366 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/0.log" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.744417 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773630 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773642 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.785199 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.802163 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.821341 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.833090 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.842379 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.859595 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.877004 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878473 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878480 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878494 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878503 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.886435 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0
715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.895099 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc 
kubenswrapper[4791]: I0217 00:06:51.928295 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.943050 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40
ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.958580 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.973941 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981263 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981338 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.997215 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.009950 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.021876 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.033319 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.046281 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083819 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185739 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185767 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185778 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.197182 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:00:32.036674482 +0000 UTC Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288266 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390486 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390503 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390542 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492807 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492846 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594684 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594693 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799382 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799438 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799460 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901892 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901998 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.902018 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004717 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004732 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004742 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.109959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110022 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110052 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110064 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.198194 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 17:57:49.76624103 +0000 UTC Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211731 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211772 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211787 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211797 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219362 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219457 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219510 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.219593 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219767 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.219840 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.220014 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.220202 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.229821 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.239867 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a26064
04b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.254485 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.270092 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.295373 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.310746 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314489 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314566 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314615 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.323542 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd9
06b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.340329 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.360752 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.375272 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.395379 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.412553 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416896 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.432621 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.445647 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.459256 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc 
kubenswrapper[4791]: I0217 00:06:53.476499 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.494971 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.511262 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520423 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520533 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.524153 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623523 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726198 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828742 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828752 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931229 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033764 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033806 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136112 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.227761 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 12:32:53.277012515 +0000 UTC Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.229018 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238555 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341514 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341611 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341627 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444127 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444247 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444300 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546643 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546654 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648813 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648872 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648894 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750804 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750825 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.754488 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.757325 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.757734 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.771829 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.781270 4791 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.790848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.801832 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.811943 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.824167 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.838390 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.851273 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852837 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852871 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.869297 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.883993 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.900993 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPat
h\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.910665 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.922252 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.942995 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d1
75b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.952329 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954771 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954794 4791 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.963815 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.973985 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.983459 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.996300 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056746 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056777 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159722 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159745 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.219582 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.219655 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.219695 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.219803 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.219907 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.220075 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.220147 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.220294 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.228341 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:08:03.36761497 +0000 UTC Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.262934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.262986 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.263002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.263025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.263043 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366384 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366406 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366435 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366459 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573713 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676001 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676181 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.761810 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.762769 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.766733 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" exitCode=1 Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.766791 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.766850 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.767783 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.768014 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778552 4791 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778670 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.783620 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc 
kubenswrapper[4791]: I0217 00:06:55.811626 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.824154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40
ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.841934 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.859601 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880479 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880852 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880926 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880787 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice 
(0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networ
ks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.897061 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.912711 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.929579 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.949214 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.972307 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983371 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983423 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983465 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983483 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.992946 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.005396 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.023950 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.036096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.045752 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.055958 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.069604 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.079065 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086599 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086607 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc 
kubenswrapper[4791]: I0217 00:06:56.086622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190307 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190389 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190449 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190476 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.229468 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 08:01:42.343208773 +0000 UTC Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293171 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293189 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400827 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503274 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503338 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503350 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606218 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708450 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708551 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.770822 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.774974 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:06:56 crc kubenswrapper[4791]: E0217 00:06:56.775240 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.792565 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.808517 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811578 
4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811589 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.819670 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.835223 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.850358 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.862449 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.872155 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.882953 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.892028 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.902217 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.911598 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913732 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913787 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913803 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913812 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.927215 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.936404 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.945902 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc 
kubenswrapper[4791]: I0217 00:06:56.966743 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.979464 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.993351 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.008146 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:57Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.015991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016062 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016073 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.018315 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:57Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118682 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118714 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220230 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220365 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220264 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220241 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220537 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220685 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220855 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220977 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222098 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222109 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222120 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222140 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.230461 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 13:33:55.453019621 +0000 UTC Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325492 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325517 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428608 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428730 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532772 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532785 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635856 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.738995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739108 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739160 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842545 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842638 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842690 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049384 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049422 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151394 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151416 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151433 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.230674 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 16:05:38.6996103 +0000 UTC Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254374 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254400 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254418 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.357576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.357907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.358026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.358164 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.358292 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.461509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.461785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.461927 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.462097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.462291 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566274 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669540 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669617 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772801 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772862 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875409 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875432 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979113 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979209 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083192 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083254 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.185957 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186086 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220002 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220055 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220055 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220288 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220311 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220431 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220573 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220664 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.231621 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 13:50:16.989418273 +0000 UTC Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289508 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289636 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393063 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393110 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393150 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496417 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496439 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599541 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599551 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701774 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701900 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804200 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804314 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804332 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907548 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907567 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010493 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113276 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113324 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215781 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215847 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215916 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.232908 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:04:28.844530185 +0000 UTC Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318566 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318585 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418278 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418374 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418398 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418416 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.442983 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457760 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.474930 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479878 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479900 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.494769 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499930 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499950 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499997 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.518260 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524340 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.543198 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.543554 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545748 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545766 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649149 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649216 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753320 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753407 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753449 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856138 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856200 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959839 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063204 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063228 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063255 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063273 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166635 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.219792 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.219819 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.219956 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220222 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220388 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220505 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.220863 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220999 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.233326 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:47:28.362032751 +0000 UTC Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272512 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272590 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.375944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376061 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376111 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479292 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479369 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479392 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479482 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582224 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.685891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.685968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.685992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.686024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.686045 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788329 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788399 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788422 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788439 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891649 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994699 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994772 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994815 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097677 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097714 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201675 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.234339 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:06:16.321339476 +0000 UTC Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.304895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.304984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.305014 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.305055 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.305087 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408848 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408863 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511767 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511830 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.512186 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.615463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.615569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.615586 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.616024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.616066 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718870 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718988 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822239 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822328 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822344 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822386 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.925763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.925841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.925858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.926349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.926398 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029693 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029736 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132708 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132766 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.219972 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.220091 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.220388 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.220371 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.220488 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.220703 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.220885 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.221041 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235521 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235582 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235656 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.236316 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:56:54.300464626 +0000 UTC Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.240374 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 
00:07:03.261142 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.284535 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.304097 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.323593 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339530 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339549 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.343624 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.366096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.382030 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.398611 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc 
kubenswrapper[4791]: I0217 00:07:03.431462 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442427 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442485 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.451460 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.471303 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.488684 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.512774 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.531893 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545814 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545825 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.549930 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.571317 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.588009 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.606498 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648364 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648417 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648478 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751430 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751516 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854217 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854270 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957974 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957993 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060544 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060607 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060626 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060674 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164722 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164805 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.237190 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:09:14.774794937 +0000 UTC Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268173 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268205 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268230 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372289 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372361 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372404 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372423 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476205 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476255 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579360 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579423 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579440 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579488 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683083 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683173 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683215 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683243 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786958 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890095 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890195 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992735 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992796 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096197 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096278 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096302 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096320 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199072 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199088 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199161 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219661 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219726 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219741 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219697 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.219871 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.219986 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.220227 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.220341 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.237842 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:58:01.226957729 +0000 UTC Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.301980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302038 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302099 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.405904 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.405971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.405989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.406018 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.406039 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510207 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510272 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510293 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510337 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613910 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716731 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819319 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819343 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819361 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923422 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923502 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923541 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.026534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.026856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.027092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.027353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.027571 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130958 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130987 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.131009 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234406 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234470 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.238640 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 23:06:10.047242762 +0000 UTC Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337091 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.338052 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.544695 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545679 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649303 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649356 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649379 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.753893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.753975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.753998 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.754027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.754049 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856722 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960259 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960310 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.001904 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.002002 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:11.001979697 +0000 UTC m=+148.481492234 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063155 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063252 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063280 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063300 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.102861 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.102926 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.102962 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.103017 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103258 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103282 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103335 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103363 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103455 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.103424123 +0000 UTC m=+148.582936700 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103300 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103571 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103256 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103256 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103658 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.10363427 +0000 UTC m=+148.583146847 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103702 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.103677431 +0000 UTC m=+148.583189998 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103727 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.103715133 +0000 UTC m=+148.583227700 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166319 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166344 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166366 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219291 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219404 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219308 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219481 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219403 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219610 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219700 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219847 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.238858 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 11:40:53.327536662 +0000 UTC Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269629 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374907 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478392 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478476 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478496 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478617 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581272 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581330 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581359 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581375 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.683991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684070 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684197 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787499 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787637 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890472 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993771 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993894 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993920 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097042 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097096 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097141 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097184 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.199960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200165 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.238992 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:09:07.088831008 +0000 UTC Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.302955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303080 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303149 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303174 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406787 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406919 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406941 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510305 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510360 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614203 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614220 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.717909 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.717971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.717989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.718009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.718024 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.820737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.820849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.820927 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.821006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.821039 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924939 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028450 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028492 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131853 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.219830 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.219959 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.220245 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.220289 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.220403 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.220645 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.220974 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.221146 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234245 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234313 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.239492 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:07:13.268386576 +0000 UTC Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.337834 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.337910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.338017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.338057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.338079 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440972 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544503 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544610 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544629 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648266 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648311 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648328 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750456 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750474 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750485 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853942 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956796 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956827 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956838 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059727 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059786 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162711 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.239920 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:52:35.573900661 +0000 UTC Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265644 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265666 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368730 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471853 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471904 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557871 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557972 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557999 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.558020 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.573101 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577640 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.591637 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596720 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596733 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.610824 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615370 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615384 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.636906 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.641569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.641749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.641878 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.642002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.642151 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.656558 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.656778 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.658820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659256 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659400 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659518 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762399 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865635 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865678 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969201 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969292 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969322 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969346 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.072619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.072949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.073100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.073334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.073467 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177096 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177124 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177146 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220264 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220407 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220445 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220610 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220641 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220787 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220896 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220985 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.240061 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:30:07.716223242 +0000 UTC Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.279905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280188 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383214 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383226 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486064 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486182 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486261 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591448 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591477 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694564 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694682 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694704 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798227 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905232 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905313 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905354 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.008917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.008990 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.009009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.009034 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.009052 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112262 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112280 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112291 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214742 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214782 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.221355 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"
Feb 17 00:07:12 crc kubenswrapper[4791]: E0217 00:07:12.221712 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.240232 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:40:36.970146783 +0000 UTC
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318638 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318653 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421339 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523699 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523750 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523773 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627450 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627493 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627513 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731291 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731316 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834695 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834887 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834948 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938477 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938503 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938522 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.042916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.042977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.042995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.043019 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.043039 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.150954 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151047 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151094 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.219961 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.220031 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.220061 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.219980 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.220265 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.221486 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.221628 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.221364 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.241210 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 04:00:47.003695596 +0000 UTC
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253679 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253718 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.261032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.278895 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.296001 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.310022 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.342437 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.355839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356194 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356508 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356617 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.360875 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.376669 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc 
kubenswrapper[4791]: I0217 00:07:13.393271 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.405798 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.427541 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.447634 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460376 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460470 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460491 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.472151 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.493364 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.509661 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.528097 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.547684 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565472 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.566711 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.583690 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.602229 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669682 4791 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669726 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773052 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773193 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.875222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.875641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.875970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.876298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.876999 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980129 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980152 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980167 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.082917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083382 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083671 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186539 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186562 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.241484 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 03:31:02.991883163 +0000 UTC
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.322775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.323208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.324243 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.324342 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.324375 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.426940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427003 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427054 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427078 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.530966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.531341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.531433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.531545 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.532139 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635683 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635694 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635711 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635722 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738465 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738483 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738541 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.841880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.841954 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.841975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.842001 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.842023 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945502 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945571 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945595 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945651 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048735 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048793 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048860 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151683 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151831 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151872 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220294 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.220522 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220560 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220576 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220601 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.220729 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.220854 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.221268 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.242206 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:06:26.786521788 +0000 UTC Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254273 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254286 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357247 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357284 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357300 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461407 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.564958 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565103 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667918 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667940 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.770983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771054 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771162 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874673 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978133 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978200 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081746 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081758 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185573 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185617 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.242822 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 20:16:20.912475175 +0000 UTC Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287693 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287753 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287793 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391289 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391435 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391493 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494274 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494285 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596774 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596816 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699344 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699358 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699367 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802143 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905178 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905400 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008476 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008550 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008614 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111990 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.112040 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214459 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.219714 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.219763 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.219725 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.219909 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.220147 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.220213 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.220479 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.220804 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.243081 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 06:51:41.86677935 +0000 UTC Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317962 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317980 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.422911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.422967 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.422984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.423045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.423104 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526731 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526814 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526838 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629481 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732455 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732523 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732550 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732683 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835217 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835274 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937878 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040985 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.041004 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.243502 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:31:38.254267248 +0000 UTC Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247200 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247294 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247374 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350681 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454749 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661760 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661776 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765192 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765250 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765262 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867365 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867458 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867482 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867504 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867520 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969476 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072314 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072352 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072370 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072387 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072397 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.219992 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.220035 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.220069 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.220020 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220219 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220416 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220554 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220715 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.244003 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 23:58:18.298683194 +0000 UTC Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277873 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277928 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277964 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381584 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381609 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.484934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485050 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485138 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588655 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691224 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691316 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691371 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793834 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896939 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000548 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000616 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.104942 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105034 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105076 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208094 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208241 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.245131 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:35:52.213486624 +0000 UTC Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.311556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.311654 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.311674 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.312087 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.312173 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420677 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420731 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420751 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523308 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523333 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523352 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.625634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.625904 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.626000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.626080 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.626164 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729577 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729709 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833012 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833263 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936726 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936781 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936803 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971578 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971597 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: E0217 00:07:20.993088 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.998724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999572 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.022452 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028502 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028524 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.050645 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056438 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056509 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.078499 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083904 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.105328 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.105653 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107845 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210558 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210589 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210611 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220063 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.220325 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220391 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220460 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220407 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.222223 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.222483 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.222722 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.245596 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:01:50.025828653 +0000 UTC Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313580 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.415936 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.415995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.416012 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.416036 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.416053 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518827 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622595 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622638 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622661 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.667744 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.668043 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.668197 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:25.668161528 +0000 UTC m=+163.147674085 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726178 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726266 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726321 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829630 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829686 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932905 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035728 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035803 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035827 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035844 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138474 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138587 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138610 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241453 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.246712 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:54:20.32516322 +0000 UTC Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344699 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344756 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344781 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552595 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552610 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656722 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656756 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656771 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.760978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761072 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761155 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863904 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863973 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966900 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966920 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069401 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069561 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.172980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173082 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173101 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220361 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220445 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220446 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.220624 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220713 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.221022 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.221179 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.221287 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.247606 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 04:38:05.577008609 +0000 UTC Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.248290 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.269157 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276652 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276671 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276719 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.291705 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.341278 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.379831 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380975 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380997 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.381013 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.394479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd9
06b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.405983 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.425222 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e95
80356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.437327 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.452553 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.472548 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484564 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.502578 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.521896 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.539979 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc 
kubenswrapper[4791]: I0217 00:07:23.558339 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.575080 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588202 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588224 4791 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.597317 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.617568 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.641331 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691813 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691833 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896730 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896831 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896879 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000554 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000573 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104440 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.207994 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208096 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208240 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.248479 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 02:45:32.286809276 +0000 UTC Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.310929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.310993 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.311010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.311033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.311050 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414343 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517166 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517230 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517294 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620666 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620707 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724390 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724407 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827921 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827982 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827991 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930884 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930956 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.033980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034030 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034085 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137417 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137500 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.219991 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.220085 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220191 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.220250 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220439 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.220503 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220556 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220652 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.221710 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.221956 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240672 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240833 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.248900 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:10:54.284363681 +0000 UTC Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344062 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344215 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344238 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.446968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447147 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.549851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.549997 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.550021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.550076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.550099 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653571 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653620 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756364 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756418 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756479 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860261 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964079 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964188 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067036 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067095 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067191 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170578 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170587 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.249421 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:31:26.020843408 +0000 UTC Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274263 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274338 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274362 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274381 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.376926 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.376992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.377013 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.377038 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.377057 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.480873 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481197 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584947 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584989 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688372 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688412 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791561 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893761 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893959 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997340 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997401 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997418 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997459 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203823 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203840 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203884 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219345 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219417 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219444 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.219499 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219600 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.219731 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.220083 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.220600 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.250360 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 18:46:52.802816865 +0000 UTC
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307037 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307171 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307187 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410127 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410150 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410193 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513654 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513787 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617281 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617306 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720102 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720167 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823700 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926884 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926896 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029588 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.133892 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.133967 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.133991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.134025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.134050 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237826 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237850 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237898 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.250506 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:40:06.441116385 +0000 UTC
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340540 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340589 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340601 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340631 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443400 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443432 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546971 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650400 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753474 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753537 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856546 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960440 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960487 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062967 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062994 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.063018 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165834 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165852 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220298 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220442 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220348 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.220539 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220375 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.220960 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.220998 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.221041 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.250654 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:50:44.719873384 +0000 UTC
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.268897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.268970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.268993 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.269027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.269052 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.371948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372082 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372363 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476703 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476879 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682512 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785653 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785698 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785720 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785729 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888730 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888746 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994600 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994694 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994789 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097205 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097326 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199806 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199986 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.251784 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 20:19:28.222833851 +0000 UTC Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.302953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.302988 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.303016 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.303033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.303042 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406554 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509307 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.612619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.612986 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.613268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.613443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.613580 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717269 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717329 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.820576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.820971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.821146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.821323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.821470 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924447 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027362 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027446 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027488 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131564 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131605 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220011 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220194 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.220330 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220442 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220527 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.220889 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.221886 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.222030 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235293 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235317 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.252892 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 22:43:11.010150122 +0000 UTC Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338439 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338541 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338560 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344040 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344173 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344192 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.409602 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2"] Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.410466 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.412834 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.413020 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.413187 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.413844 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.479049 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podStartSLOduration=89.479031381 podStartE2EDuration="1m29.479031381s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.478930968 +0000 UTC m=+108.958443505" watchObservedRunningTime="2026-02-17 00:07:31.479031381 +0000 UTC m=+108.958543908" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.482889 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.482942 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.483002 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.483041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.483085 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.495449 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" podStartSLOduration=88.495434056 podStartE2EDuration="1m28.495434056s" 
podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.495234549 +0000 UTC m=+108.974747076" watchObservedRunningTime="2026-02-17 00:07:31.495434056 +0000 UTC m=+108.974946583" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.510591 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.51055071 podStartE2EDuration="1m29.51055071s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.510432736 +0000 UTC m=+108.989945253" watchObservedRunningTime="2026-02-17 00:07:31.51055071 +0000 UTC m=+108.990063247" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.526079 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.526059165 podStartE2EDuration="1m25.526059165s" podCreationTimestamp="2026-02-17 00:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.526035875 +0000 UTC m=+109.005548412" watchObservedRunningTime="2026-02-17 00:07:31.526059165 +0000 UTC m=+109.005571692" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.542058 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-299s7" podStartSLOduration=89.542035606 podStartE2EDuration="1m29.542035606s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.541910772 +0000 UTC m=+109.021423309" 
watchObservedRunningTime="2026-02-17 00:07:31.542035606 +0000 UTC m=+109.021548143" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583784 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583887 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583919 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583969 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.584004 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.584045 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.584080 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.585164 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.591909 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 
00:07:31.605385 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.628359 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-k5kxc" podStartSLOduration=89.628340629 podStartE2EDuration="1m29.628340629s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.608569706 +0000 UTC m=+109.088082273" watchObservedRunningTime="2026-02-17 00:07:31.628340629 +0000 UTC m=+109.107853156" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.658417 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.65839696 podStartE2EDuration="1m28.65839696s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.657824681 +0000 UTC m=+109.137337218" watchObservedRunningTime="2026-02-17 00:07:31.65839696 +0000 UTC m=+109.137909487" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.667953 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dl4gt" podStartSLOduration=89.667931224 podStartE2EDuration="1m29.667931224s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.667670817 +0000 UTC m=+109.147183334" 
watchObservedRunningTime="2026-02-17 00:07:31.667931224 +0000 UTC m=+109.147443751" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.728611 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.728591026 podStartE2EDuration="58.728591026s" podCreationTimestamp="2026-02-17 00:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.711919663 +0000 UTC m=+109.191432220" watchObservedRunningTime="2026-02-17 00:07:31.728591026 +0000 UTC m=+109.208103563" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.729036 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=38.729029 podStartE2EDuration="38.729029s" podCreationTimestamp="2026-02-17 00:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.728543414 +0000 UTC m=+109.208055951" watchObservedRunningTime="2026-02-17 00:07:31.729029 +0000 UTC m=+109.208541537" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.730132 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.746069 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8stwf" podStartSLOduration=89.746051325 podStartE2EDuration="1m29.746051325s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.745857418 +0000 UTC m=+109.225369965" watchObservedRunningTime="2026-02-17 00:07:31.746051325 +0000 UTC m=+109.225563852" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.915798 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" event={"ID":"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1","Type":"ContainerStarted","Data":"e54a80eac0abbabaa9da307ea9535e11d2e9952af7df45436c5550d2f1b5c91e"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.916167 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" event={"ID":"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1","Type":"ContainerStarted","Data":"0eda90be92faad7285114516a7f3ceebb90d582221526bb0138bb395d888ec3b"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.946458 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" podStartSLOduration=89.946434226 podStartE2EDuration="1m29.946434226s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.946330222 +0000 UTC m=+109.425842799" watchObservedRunningTime="2026-02-17 00:07:31.946434226 +0000 UTC m=+109.425946763" Feb 17 00:07:32 crc 
kubenswrapper[4791]: I0217 00:07:32.253639 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:58:12.708548406 +0000 UTC Feb 17 00:07:32 crc kubenswrapper[4791]: I0217 00:07:32.254698 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 00:07:32 crc kubenswrapper[4791]: I0217 00:07:32.266744 4791 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.219757 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.220370 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222139 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.222166 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.222208 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222229 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222321 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222475 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.219578 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.219781 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.220612 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.220654 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.220744 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.220882 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.221020 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.221306 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.935599 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936618 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/0.log" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936672 4791 generic.go:334] "Generic (PLEG): container finished" podID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c" exitCode=1 Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936704 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerDied","Data":"6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"} Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936737 4791 scope.go:117] "RemoveContainer" containerID="de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.937064 4791 scope.go:117] "RemoveContainer" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c" Feb 17 00:07:36 crc kubenswrapper[4791]: E0217 
00:07:36.937257 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-299s7_openshift-multus(1104c109-74aa-4fc4-8a1b-914a0d5803a4)\"" pod="openshift-multus/multus-299s7" podUID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.219702 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.219852 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.219974 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.220068 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.220390 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.220520 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.220681 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.221366 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.942100 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.221460 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.946298 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.949254 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.949661 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.991171 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podStartSLOduration=96.991157285 podStartE2EDuration="1m36.991157285s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:38.989324066 +0000 UTC m=+116.468836593" watchObservedRunningTime="2026-02-17 00:07:38.991157285 +0000 UTC m=+116.470669812" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.063506 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6x28n"] Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.063630 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.063730 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.220274 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.220307 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.220365 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.220474 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.220592 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.220785 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220273 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220358 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220547 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220274 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220532 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220686 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220798 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220924 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.211540 4791 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.219872 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.219912 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.220006 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222053 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.222170 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222352 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222441 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222596 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.343415 4791 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220401 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220495 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220600 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.220980 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.221288 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.221409 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.221523 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.219846 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.220021 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220070 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.220176 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.220190 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220343 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220532 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220663 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:48 crc kubenswrapper[4791]: E0217 00:07:48.344956 4791 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220020 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220174 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220230 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220268 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220193 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220516 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220587 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220815 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:50 crc kubenswrapper[4791]: I0217 00:07:50.219858 4791 scope.go:117] "RemoveContainer" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.000398 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.000795 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea"}
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.219886 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.219964 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220062 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.220122 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.220142 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220299 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220479 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220616 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.220177 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.220277 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.222281 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.222312 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.222467 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.222644 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.222858 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.223015 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.219816 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.219910 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.219833 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.220277 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223772 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223930 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223967 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223977 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.224168 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.224562 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.717059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.773804 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.774420 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.779831 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.780099 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.780501 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.780548 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.781388 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.781938 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.783521 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.784127 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.784699 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.790626 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791027 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791431 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791698 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791950 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.792179 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.793463 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rt865"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.793909 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.793993 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29521440-k6f7k"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.801836 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt865"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.809684 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flvjk"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.811000 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.812331 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-49n75"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.813274 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.814034 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.825973 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.826883 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.827618 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.828427 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.828635 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829061 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829797 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829904 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829933 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830040 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830063 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830304 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830470 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830562 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830671 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830810 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.831166 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.832270 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.832907 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.832990 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.834240 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.837717 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.838406 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.838775 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.839112 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.839218 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.841104 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t5827"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.841657 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.843028 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-frmbv"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.843613 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.845435 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stqb9"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.846102 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.849645 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.849870 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.849938 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850035 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850052 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850095 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850177 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850210 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850283 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850336 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850395 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850426 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850514 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850560 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850610 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850673 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850700 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850766 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850783 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850850 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850864 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850943 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850960 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851101 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851228 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851271 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851232 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851392 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851629 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.853347 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.854283 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.856231 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.856387 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.856612 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.857826 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.859195 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.860330 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.870040 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.870431 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.871053 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.871177 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.871364 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.873508 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.874591 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.874861 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875116 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875393 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875692 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875997 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.876619 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.876956 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.877921 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.878707 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.879090 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.879269 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.881918 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.882049 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.898709 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.898994 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899065 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899199 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899269 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899359 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899451 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899585 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899641 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900116 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900217 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900273 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900486 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900886 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.901191 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.901205 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.901400 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.902249 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903495 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903644 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903749 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903763 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903875 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.904055 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.904473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.904581 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.909565 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.910078 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.910391 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.910946 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.911102 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.911206 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.911414 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.912343 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.912744 4791 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.914330 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.915251 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.915676 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.916306 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.924337 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.924753 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.925824 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.926009 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.926464 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.926787 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.927032 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.927088 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.927458 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935102 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-config\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935160 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d469ce1-e7ed-4826-a378-0de16f2b4e56-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935186 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-oauth-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 
00:08:01.935231 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-auth-proxy-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935256 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935272 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935291 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c752f56-7754-4718-aea5-cb41d6ac4253-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935310 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-serving-cert\") pod \"apiserver-7bbb656c7d-sgzjl\" 
(UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935340 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9pgd\" (UniqueName: \"kubernetes.io/projected/3d469ce1-e7ed-4826-a378-0de16f2b4e56-kube-api-access-m9pgd\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935359 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935378 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935397 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-serving-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935413 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-etcd-client\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935435 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnsxm\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-kube-api-access-gnsxm\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935455 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-config\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935473 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-images\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935503 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-client\") pod \"apiserver-76f77b778f-flvjk\" (UID: 
\"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935521 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935536 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935558 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f0fa93-740f-43aa-9350-24d9920a9345-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935578 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935599 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935622 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-oauth-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935640 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935664 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935686 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-service-ca\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 
00:08:01.935704 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935724 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kss\" (UniqueName: \"kubernetes.io/projected/643578b4-75ca-4765-8df5-9167688e3ced-kube-api-access-x8kss\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935744 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7pk\" (UniqueName: \"kubernetes.io/projected/155619c1-12ba-4149-9dce-474e3735168c-kube-api-access-gt7pk\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935764 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d469ce1-e7ed-4826-a378-0de16f2b4e56-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935783 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935802 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935824 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-image-import-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935851 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-encryption-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935879 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-config\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:01 crc 
kubenswrapper[4791]: I0217 00:08:01.935905 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935921 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935943 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935971 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-trusted-ca-bundle\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935997 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/839b6744-bbe6-4b56-b020-181d86c604fe-machine-approver-tls\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936016 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936035 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936061 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" 
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936117 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936157 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936194 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936233 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936298 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsnph\" 
(UniqueName: \"kubernetes.io/projected/0522c983-dae6-41ca-807a-ff45912a0024-kube-api-access-fsnph\") pod \"downloads-7954f5f757-rt865\" (UID: \"0522c983-dae6-41ca-807a-ff45912a0024\") " pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936847 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936895 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936921 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49kkg\" (UniqueName: \"kubernetes.io/projected/9c752f56-7754-4718-aea5-cb41d6ac4253-kube-api-access-49kkg\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937581 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-serving-cert\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937691 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937752 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-etcd-client\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937787 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-encryption-config\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937852 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937976 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-config\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938008 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgvc\" (UniqueName: \"kubernetes.io/projected/4360bf41-9e45-498e-8f94-2c43a0dc88e5-kube-api-access-8pgvc\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938136 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f00b345-a265-41cc-89b7-6f059fc4d5d1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938208 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad62ba3d-c60a-4e1f-9768-187e74151f24-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938239 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938294 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit-dir\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938440 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbqwz\" (UniqueName: \"kubernetes.io/projected/1b1913d4-85d3-4596-acea-6e272cf81e8e-kube-api-access-cbqwz\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938440 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938484 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-console-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938541 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-audit-policies\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938598 
4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/643578b4-75ca-4765-8df5-9167688e3ced-audit-dir\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938659 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ssg\" (UniqueName: \"kubernetes.io/projected/5f00b345-a265-41cc-89b7-6f059fc4d5d1-kube-api-access-f4ssg\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938696 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6bb\" (UniqueName: \"kubernetes.io/projected/839b6744-bbe6-4b56-b020-181d86c604fe-kube-api-access-pp6bb\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938737 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-node-pullsecrets\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938777 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938853 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-trusted-ca\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360bf41-9e45-498e-8f94-2c43a0dc88e5-serving-cert\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939191 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-serving-cert\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939256 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939341 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939367 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939464 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939474 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-serving-cert\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939515 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939708 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw65g\" (UniqueName: \"kubernetes.io/projected/ad62ba3d-c60a-4e1f-9768-187e74151f24-kube-api-access-vw65g\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939772 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939832 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9z7g\" (UniqueName: \"kubernetes.io/projected/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-kube-api-access-x9z7g\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939874 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4kmn\" (UniqueName: \"kubernetes.io/projected/5526b957-e33f-4952-8dda-d2875c94686a-kube-api-access-w4kmn\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939912 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b7f0fa93-740f-43aa-9350-24d9920a9345-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939935 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-service-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.940026 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ad62ba3d-c60a-4e1f-9768-187e74151f24-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.940064 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.940335 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.942872 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt865"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 
00:08:01.942926 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jsj6"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.945249 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.945276 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.945620 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.947341 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.949039 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.952643 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-72t6m"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.953438 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.953656 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.954008 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.955299 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.955432 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kt8q6"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.956498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.959573 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.961155 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.962044 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.966580 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.968382 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.968702 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.969366 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.971101 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.973916 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-49n75"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.976214 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.978453 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.979843 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29521440-k6f7k"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.981619 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.984244 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flvjk"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.986375 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5bsn7"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.986964 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.987996 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t5827"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.994216 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-frmbv"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.995947 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jsj6"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.996248 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.996876 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.998295 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.999734 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.000853 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.002088 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.003109 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"] Feb 17 00:08:02 crc 
kubenswrapper[4791]: I0217 00:08:02.006223 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.006510 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.019420 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tlpgd"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.020473 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4chtt"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.020542 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.021281 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.021517 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stqb9"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.022526 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.023542 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.024559 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.026064 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.027288 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.028402 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.029428 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.030491 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.031577 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"] Feb 
17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.032696 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.034312 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.035990 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-72t6m"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.036308 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.037489 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.038765 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4chtt"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.039815 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tlpgd"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041038 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041368 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041496 4791 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-etcd-client\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041616 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-config\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041727 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgvc\" (UniqueName: \"kubernetes.io/projected/4360bf41-9e45-498e-8f94-2c43a0dc88e5-kube-api-access-8pgvc\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041852 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3032312-913c-4072-ac18-56fdc689cbac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.042020 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.042956 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043256 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit-dir\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043339 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbqwz\" (UniqueName: \"kubernetes.io/projected/1b1913d4-85d3-4596-acea-6e272cf81e8e-kube-api-access-cbqwz\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043374 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/643578b4-75ca-4765-8df5-9167688e3ced-audit-dir\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043386 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit-dir\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043395 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6bb\" (UniqueName: \"kubernetes.io/projected/839b6744-bbe6-4b56-b020-181d86c604fe-kube-api-access-pp6bb\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043477 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b747aa6-3874-4f71-86bb-d340398d7bc4-config\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043518 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ssg\" (UniqueName: \"kubernetes.io/projected/5f00b345-a265-41cc-89b7-6f059fc4d5d1-kube-api-access-f4ssg\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043554 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043577 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-serving-cert\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043277 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-config\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043717 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/643578b4-75ca-4765-8df5-9167688e3ced-audit-dir\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043839 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/03d7a8df-a8a3-4b34-bd28-d554ae70875a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043908 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-serving-cert\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043968 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044003 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4kmn\" (UniqueName: \"kubernetes.io/projected/5526b957-e33f-4952-8dda-d2875c94686a-kube-api-access-w4kmn\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044035 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f0fa93-740f-43aa-9350-24d9920a9345-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044080 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b747aa6-3874-4f71-86bb-d340398d7bc4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044173 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ad62ba3d-c60a-4e1f-9768-187e74151f24-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044215 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-service-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044250 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6bf\" (UniqueName: \"kubernetes.io/projected/e3032312-913c-4072-ac18-56fdc689cbac-kube-api-access-8g6bf\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044293 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044330 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-config\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044368 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044437 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c752f56-7754-4718-aea5-cb41d6ac4253-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044467 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-serving-cert\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044499 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044536 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: 
\"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044600 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044656 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbb6h\" (UniqueName: \"kubernetes.io/projected/03d7a8df-a8a3-4b34-bd28-d554ae70875a-kube-api-access-zbb6h\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044737 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b747aa6-3874-4f71-86bb-d340398d7bc4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044762 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2ts\" (UniqueName: \"kubernetes.io/projected/fe44c059-87ef-4805-b78f-b8c3cdfd844e-kube-api-access-rv2ts\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 
00:08:02.044805 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jpw\" (UniqueName: \"kubernetes.io/projected/9578978b-522d-48d8-9b08-384752fc49a1-kube-api-access-k2jpw\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044850 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-config\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044890 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044910 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044950 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 
00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044975 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-oauth-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045049 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6c19ecc-0208-46de-8c03-6780bba30353-tmpfs\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045078 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d469ce1-e7ed-4826-a378-0de16f2b4e56-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045100 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045124 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7pk\" (UniqueName: \"kubernetes.io/projected/155619c1-12ba-4149-9dce-474e3735168c-kube-api-access-gt7pk\") pod 
\"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045158 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045234 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-apiservice-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045304 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045327 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-image-import-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045395 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045416 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045487 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3032312-913c-4072-ac18-56fdc689cbac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045526 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qdh\" (UniqueName: \"kubernetes.io/projected/71967495-8841-4810-89e5-e114b9887c5e-kube-api-access-l4qdh\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045549 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9578978b-522d-48d8-9b08-384752fc49a1-signing-key\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: 
\"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045571 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-encryption-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045594 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-config\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045623 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-trusted-ca-bundle\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045644 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/839b6744-bbe6-4b56-b020-181d86c604fe-machine-approver-tls\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045669 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8zlc\" (UniqueName: 
\"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045686 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045740 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045775 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045800 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045989 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrvj\" (UniqueName: \"kubernetes.io/projected/459f3992-b770-44d7-9ecc-0ae8a228134f-kube-api-access-2qrvj\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046097 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-serving-cert\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-encryption-config\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046244 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svp66\" (UniqueName: \"kubernetes.io/projected/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-kube-api-access-svp66\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046274 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f00b345-a265-41cc-89b7-6f059fc4d5d1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046324 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad62ba3d-c60a-4e1f-9768-187e74151f24-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046389 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046426 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-webhook-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-console-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046487 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-audit-policies\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046511 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046537 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9578978b-522d-48d8-9b08-384752fc49a1-signing-cabundle\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046612 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-trusted-ca\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046641 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 
00:08:02.046702 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-node-pullsecrets\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046923 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046945 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046971 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: 
\"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046995 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360bf41-9e45-498e-8f94-2c43a0dc88e5-serving-cert\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047058 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw65g\" (UniqueName: \"kubernetes.io/projected/ad62ba3d-c60a-4e1f-9768-187e74151f24-kube-api-access-vw65g\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047077 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047175 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9z7g\" (UniqueName: \"kubernetes.io/projected/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-kube-api-access-x9z7g\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047214 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047246 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047281 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d469ce1-e7ed-4826-a378-0de16f2b4e56-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047314 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-oauth-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-auth-proxy-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047380 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047382 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-serving-cert\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047408 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqmg\" (UniqueName: \"kubernetes.io/projected/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-kube-api-access-wvqmg\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047457 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9pgd\" (UniqueName: \"kubernetes.io/projected/3d469ce1-e7ed-4826-a378-0de16f2b4e56-kube-api-access-m9pgd\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047486 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966rv\" (UniqueName: 
\"kubernetes.io/projected/b6c19ecc-0208-46de-8c03-6780bba30353-kube-api-access-966rv\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047513 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047540 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-serving-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047571 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-etcd-client\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047608 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-images\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047642 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnsxm\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-kube-api-access-gnsxm\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047670 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ph6\" (UniqueName: \"kubernetes.io/projected/c5fb65f7-7cc6-4834-853e-a91eebc956fd-kube-api-access-r5ph6\") pod \"migrator-59844c95c7-952sm\" (UID: \"c5fb65f7-7cc6-4834-853e-a91eebc956fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047700 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-client\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047735 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-images\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047754 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047768 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f0fa93-740f-43aa-9350-24d9920a9345-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047806 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047849 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047876 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047909 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047930 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-service-ca\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047952 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kss\" (UniqueName: \"kubernetes.io/projected/643578b4-75ca-4765-8df5-9167688e3ced-kube-api-access-x8kss\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047970 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-metrics-tls\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047993 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048046 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe44c059-87ef-4805-b78f-b8c3cdfd844e-proxy-tls\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048072 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048097 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048121 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048158 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsnph\" (UniqueName: \"kubernetes.io/projected/0522c983-dae6-41ca-807a-ff45912a0024-kube-api-access-fsnph\") pod \"downloads-7954f5f757-rt865\" (UID: \"0522c983-dae6-41ca-807a-ff45912a0024\") " pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048215 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048249 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048271 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49kkg\" (UniqueName: \"kubernetes.io/projected/9c752f56-7754-4718-aea5-cb41d6ac4253-kube-api-access-49kkg\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048294 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/71967495-8841-4810-89e5-e114b9887c5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048470 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048603 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049165 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ad62ba3d-c60a-4e1f-9768-187e74151f24-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049230 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-service-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049598 4791 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049647 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.050554 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-serving-cert\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.050623 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.052459 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.053353 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-config\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.053809 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: 
\"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.055504 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.055654 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.055957 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-946wq"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.056365 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-config\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.056839 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc 
kubenswrapper[4791]: I0217 00:08:02.057302 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.057561 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058620 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058796 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-946wq"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058795 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-images\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058908 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.059161 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-etcd-client\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.059700 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-config\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060368 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-encryption-config\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060386 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-image-import-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060524 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-serving-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060591 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060561 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-serving-cert\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060676 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-node-pullsecrets\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061067 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-serving-cert\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061301 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-oauth-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " 
pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061460 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f0fa93-740f-43aa-9350-24d9920a9345-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061536 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f00b345-a265-41cc-89b7-6f059fc4d5d1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062055 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062214 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062333 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-encryption-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062730 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062940 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d469ce1-e7ed-4826-a378-0de16f2b4e56-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-console-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063288 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063322 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063430 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-trusted-ca-bundle\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063648 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c752f56-7754-4718-aea5-cb41d6ac4253-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063734 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-audit-policies\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064486 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-auth-proxy-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064629 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d469ce1-e7ed-4826-a378-0de16f2b4e56-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064627 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-client\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064836 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-oauth-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064837 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.065051 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.065672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-etcd-client\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066078 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066111 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-service-ca\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066383 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066470 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066484 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f0fa93-740f-43aa-9350-24d9920a9345-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067022 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067439 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/839b6744-bbe6-4b56-b020-181d86c604fe-machine-approver-tls\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067471 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067704 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-trusted-ca\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067986 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.068358 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.068548 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360bf41-9e45-498e-8f94-2c43a0dc88e5-serving-cert\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.069684 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.069886 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.070381 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.070462 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.070558 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.072680 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad62ba3d-c60a-4e1f-9768-187e74151f24-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.077104 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.096201 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.116364 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.136384 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149190 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/03d7a8df-a8a3-4b34-bd28-d554ae70875a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149228 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b747aa6-3874-4f71-86bb-d340398d7bc4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149249 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g6bf\" (UniqueName: \"kubernetes.io/projected/e3032312-913c-4072-ac18-56fdc689cbac-kube-api-access-8g6bf\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149280 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149306 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149335 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbb6h\" (UniqueName: \"kubernetes.io/projected/03d7a8df-a8a3-4b34-bd28-d554ae70875a-kube-api-access-zbb6h\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149377 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b747aa6-3874-4f71-86bb-d340398d7bc4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149399 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2ts\" (UniqueName: \"kubernetes.io/projected/fe44c059-87ef-4805-b78f-b8c3cdfd844e-kube-api-access-rv2ts\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149421 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jpw\" (UniqueName: \"kubernetes.io/projected/9578978b-522d-48d8-9b08-384752fc49a1-kube-api-access-k2jpw\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149479 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6c19ecc-0208-46de-8c03-6780bba30353-tmpfs\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149511 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-apiservice-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149553 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149571 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3032312-913c-4072-ac18-56fdc689cbac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149630 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qdh\" (UniqueName: \"kubernetes.io/projected/71967495-8841-4810-89e5-e114b9887c5e-kube-api-access-l4qdh\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149699 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9578978b-522d-48d8-9b08-384752fc49a1-signing-key\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149737 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrvj\" (UniqueName: \"kubernetes.io/projected/459f3992-b770-44d7-9ecc-0ae8a228134f-kube-api-access-2qrvj\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149788 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svp66\" (UniqueName: \"kubernetes.io/projected/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-kube-api-access-svp66\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149811 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-webhook-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149827 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149981 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9578978b-522d-48d8-9b08-384752fc49a1-signing-cabundle\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150059 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150191 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150271 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqmg\" (UniqueName: \"kubernetes.io/projected/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-kube-api-access-wvqmg\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150332 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966rv\" (UniqueName: \"kubernetes.io/projected/b6c19ecc-0208-46de-8c03-6780bba30353-kube-api-access-966rv\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150356 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150412 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-images\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150445 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ph6\" (UniqueName: \"kubernetes.io/projected/c5fb65f7-7cc6-4834-853e-a91eebc956fd-kube-api-access-r5ph6\") pod \"migrator-59844c95c7-952sm\" (UID: \"c5fb65f7-7cc6-4834-853e-a91eebc956fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150503 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150541 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-metrics-tls\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150568 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe44c059-87ef-4805-b78f-b8c3cdfd844e-proxy-tls\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150648 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/71967495-8841-4810-89e5-e114b9887c5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150694 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3032312-913c-4072-ac18-56fdc689cbac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150749 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b747aa6-3874-4f71-86bb-d340398d7bc4-config\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150926 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.151016 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6c19ecc-0208-46de-8c03-6780bba30353-tmpfs\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.163785 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.177527 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.197382 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.216349 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.224005 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.239207 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.256663 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.260187 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.278489 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.297641 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.318896 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.338311 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.343359 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3032312-913c-4072-ac18-56fdc689cbac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.356992 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.361880 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3032312-913c-4072-ac18-56fdc689cbac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.377183 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.396360 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.418254 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.436942 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.456925 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.462450 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b747aa6-3874-4f71-86bb-d340398d7bc4-config\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.477928 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.497259 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.504776 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b747aa6-3874-4f71-86bb-d340398d7bc4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.516955 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.537966 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.557520 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.577399 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.598175 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.617508 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.636781 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.657739 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.664867 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/03d7a8df-a8a3-4b34-bd28-d554ae70875a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.677442 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.697666 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.706397 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.718164 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.736647 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.757243 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.777456 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.797079 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.817627 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.837608 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.856733 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.865502 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-webhook-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.865707 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-apiservice-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.877876 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.896753 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.904061 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-images\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.917481 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.925359 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9578978b-522d-48d8-9b08-384752fc49a1-signing-key\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.936986 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.955814 4791 request.go:700] Waited for 1.009901131s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dsigning-cabundle&limit=500&resourceVersion=0
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.957937 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.962399 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9578978b-522d-48d8-9b08-384752fc49a1-signing-cabundle\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.977742 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.998365 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.017921 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.025989 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe44c059-87ef-4805-b78f-b8c3cdfd844e-proxy-tls\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.037669 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.057523 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.077563 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 17 00:08:03 crc 
kubenswrapper[4791]: I0217 00:08:03.096448 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.107117 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/71967495-8841-4810-89e5-e114b9887c5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.117619 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.126264 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-metrics-tls\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.136898 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.149914 4791 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.150045 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. No retries permitted until 2026-02-17 00:08:03.650017401 +0000 UTC m=+141.129530038 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152658 4791 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152722 4791 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152733 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. No retries permitted until 2026-02-17 00:08:03.652716538 +0000 UTC m=+141.132229175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync configmap cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152797 4791 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152831 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:03.652805601 +0000 UTC m=+141.132318158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152855 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. No retries permitted until 2026-02-17 00:08:03.652841142 +0000 UTC m=+141.132353679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.157065 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.176887 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.196852 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.217055 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.236677 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.257125 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.276910 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.296246 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.316928 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.337921 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.356813 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.398518 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.417435 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.438408 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.457480 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.477089 4791 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.498171 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.517388 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.537346 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.557536 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.589525 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.597672 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.617743 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.638508 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.657313 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.697961 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 00:08:03 crc 
kubenswrapper[4791]: I0217 00:08:03.703519 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.703683 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.703745 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.703801 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.705783 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 
00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.707851 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.708490 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.711008 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.717301 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.737591 4791 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.756798 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.776616 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.797977 4791 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.839387 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgvc\" (UniqueName: \"kubernetes.io/projected/4360bf41-9e45-498e-8f94-2c43a0dc88e5-kube-api-access-8pgvc\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.854312 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbqwz\" (UniqueName: \"kubernetes.io/projected/1b1913d4-85d3-4596-acea-6e272cf81e8e-kube-api-access-cbqwz\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.874915 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6bb\" (UniqueName: \"kubernetes.io/projected/839b6744-bbe6-4b56-b020-181d86c604fe-kube-api-access-pp6bb\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.894607 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ssg\" (UniqueName: \"kubernetes.io/projected/5f00b345-a265-41cc-89b7-6f059fc4d5d1-kube-api-access-f4ssg\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.916214 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4kmn\" (UniqueName: 
\"kubernetes.io/projected/5526b957-e33f-4952-8dda-d2875c94686a-kube-api-access-w4kmn\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.941531 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9z7g\" (UniqueName: \"kubernetes.io/projected/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-kube-api-access-x9z7g\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.950118 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.961000 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9pgd\" (UniqueName: \"kubernetes.io/projected/3d469ce1-e7ed-4826-a378-0de16f2b4e56-kube-api-access-m9pgd\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.973306 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.975788 4791 request.go:700] Waited for 1.917160841s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.992486 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7pk\" (UniqueName: \"kubernetes.io/projected/155619c1-12ba-4149-9dce-474e3735168c-kube-api-access-gt7pk\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.997406 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.018084 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.027043 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.040603 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.057312 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.067741 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.067752 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.076633 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.081855 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnsxm\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-kube-api-access-gnsxm\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.109032 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsnph\" (UniqueName: \"kubernetes.io/projected/0522c983-dae6-41ca-807a-ff45912a0024-kube-api-access-fsnph\") pod \"downloads-7954f5f757-rt865\" (UID: \"0522c983-dae6-41ca-807a-ff45912a0024\") " pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.117357 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.121271 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.129216 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.138427 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.140486 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.153716 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kss\" (UniqueName: \"kubernetes.io/projected/643578b4-75ca-4765-8df5-9167688e3ced-kube-api-access-x8kss\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.156980 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.194652 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.202489 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw65g\" (UniqueName: \"kubernetes.io/projected/ad62ba3d-c60a-4e1f-9768-187e74151f24-kube-api-access-vw65g\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.212439 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.233554 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.234072 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.253576 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49kkg\" (UniqueName: \"kubernetes.io/projected/9c752f56-7754-4718-aea5-cb41d6ac4253-kube-api-access-49kkg\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.271801 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.272550 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b747aa6-3874-4f71-86bb-d340398d7bc4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.284456 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29521440-k6f7k"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.291320 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbb6h\" (UniqueName: \"kubernetes.io/projected/03d7a8df-a8a3-4b34-bd28-d554ae70875a-kube-api-access-zbb6h\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.309868 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.324391 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.332066 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g6bf\" (UniqueName: \"kubernetes.io/projected/e3032312-913c-4072-ac18-56fdc689cbac-kube-api-access-8g6bf\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:04 crc kubenswrapper[4791]: W0217 00:08:04.334717 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94401a93_55c7_4e8b_83f7_dc27a876f335.slice/crio-142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2 WatchSource:0}: Error finding container 142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2: Status 404 returned error can't find the container with id 142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2 Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.363773 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2ts\" (UniqueName: \"kubernetes.io/projected/fe44c059-87ef-4805-b78f-b8c3cdfd844e-kube-api-access-rv2ts\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 
17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.364280 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.376357 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jpw\" (UniqueName: \"kubernetes.io/projected/9578978b-522d-48d8-9b08-384752fc49a1-kube-api-access-k2jpw\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.382251 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.388395 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.392466 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qdh\" (UniqueName: \"kubernetes.io/projected/71967495-8841-4810-89e5-e114b9887c5e-kube-api-access-l4qdh\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.397563 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.404903 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.409062 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrvj\" (UniqueName: \"kubernetes.io/projected/459f3992-b770-44d7-9ecc-0ae8a228134f-kube-api-access-2qrvj\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.413023 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.430183 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svp66\" (UniqueName: \"kubernetes.io/projected/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-kube-api-access-svp66\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.457849 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966rv\" (UniqueName: \"kubernetes.io/projected/b6c19ecc-0208-46de-8c03-6780bba30353-kube-api-access-966rv\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.476772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqmg\" (UniqueName: \"kubernetes.io/projected/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-kube-api-access-wvqmg\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 
00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.492786 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ph6\" (UniqueName: \"kubernetes.io/projected/c5fb65f7-7cc6-4834-853e-a91eebc956fd-kube-api-access-r5ph6\") pod \"migrator-59844c95c7-952sm\" (UID: \"c5fb65f7-7cc6-4834-853e-a91eebc956fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.511739 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525353 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525408 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9kxv\" (UniqueName: \"kubernetes.io/projected/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-kube-api-access-l9kxv\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525444 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 
00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525514 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afb516ca-988f-4b77-aea0-10cd22ce2b77-metrics-tls\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525540 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525589 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525652 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525675 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-config\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525706 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525739 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525761 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525785 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525807 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8htvc\" (UniqueName: \"kubernetes.io/projected/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-kube-api-access-8htvc\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525845 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525906 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkpm\" (UniqueName: \"kubernetes.io/projected/57ed01e7-bfeb-428e-88c2-371662581ddf-kube-api-access-6kkpm\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525930 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525967 
4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525992 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526010 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-proxy-tls\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526033 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526082 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/57ed01e7-bfeb-428e-88c2-371662581ddf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526182 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526206 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed01e7-bfeb-428e-88c2-371662581ddf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afb516ca-988f-4b77-aea0-10cd22ce2b77-trusted-ca\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526359 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hr4t\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-kube-api-access-8hr4t\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526431 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526480 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-srv-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.527809 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.02779327 +0000 UTC m=+142.507305887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525034 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.535454 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.552067 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.559041 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.559170 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flvjk"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.565958 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-49n75"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.584975 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.594090 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.604752 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:04 crc kubenswrapper[4791]: W0217 00:08:04.607672 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1913d4_85d3_4596_acea_6e272cf81e8e.slice/crio-dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769 WatchSource:0}: Error finding container dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769: Status 404 returned error can't find the container with id dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769 Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.614479 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627034 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627263 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed01e7-bfeb-428e-88c2-371662581ddf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-metrics-tls\") pod \"dns-default-4chtt\" 
(UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627308 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-profile-collector-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627325 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627340 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed01e7-bfeb-428e-88c2-371662581ddf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627358 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-plugins-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.627677 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.12762378 +0000 UTC m=+142.607136307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.628357 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed01e7-bfeb-428e-88c2-371662581ddf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.628418 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afb516ca-988f-4b77-aea0-10cd22ce2b77-trusted-ca\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.628443 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkz8g\" (UniqueName: \"kubernetes.io/projected/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-kube-api-access-kkz8g\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " 
pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629354 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afb516ca-988f-4b77-aea0-10cd22ce2b77-trusted-ca\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629703 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hr4t\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-kube-api-access-8hr4t\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629793 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629814 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-certs\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.630608 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-srv-cert\") pod 
\"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.630624 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.130610436 +0000 UTC m=+142.610122963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649710 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-srv-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649794 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-cert\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649815 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-registration-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649845 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649888 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-socket-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649914 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649939 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9kxv\" (UniqueName: \"kubernetes.io/projected/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-kube-api-access-l9kxv\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: 
I0217 00:08:04.649964 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46st9\" (UniqueName: \"kubernetes.io/projected/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-kube-api-access-46st9\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650062 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650188 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8af772-70a9-4758-b597-363c1db463ad-config\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650226 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afb516ca-988f-4b77-aea0-10cd22ce2b77-metrics-tls\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650294 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650318 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5vvm\" (UniqueName: \"kubernetes.io/projected/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-kube-api-access-n5vvm\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650339 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-csi-data-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650381 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651273 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651352 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651517 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-mountpoint-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651601 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651624 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-config\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651649 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-config-volume\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653007 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653042 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fz5j\" (UniqueName: \"kubernetes.io/projected/ae8af772-70a9-4758-b597-363c1db463ad-kube-api-access-7fz5j\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653386 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653419 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653444 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: 
\"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653491 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8htvc\" (UniqueName: \"kubernetes.io/projected/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-kube-api-access-8htvc\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653513 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8af772-70a9-4758-b597-363c1db463ad-serving-cert\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.655090 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed01e7-bfeb-428e-88c2-371662581ddf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.655911 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.656467 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-config\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.659602 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-frmbv"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.660090 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.661092 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.662740 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.667099 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg877\" (UniqueName: \"kubernetes.io/projected/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-kube-api-access-lg877\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.670465 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.671376 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-node-bootstrap-token\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.671911 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.672165 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kkpm\" (UniqueName: \"kubernetes.io/projected/57ed01e7-bfeb-428e-88c2-371662581ddf-kube-api-access-6kkpm\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.673001 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.673238 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.675811 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.676275 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.677003 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.677044 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-proxy-tls\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.677522 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.678212 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcqqr\" (UniqueName: \"kubernetes.io/projected/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-kube-api-access-fcqqr\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.678990 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afb516ca-988f-4b77-aea0-10cd22ce2b77-metrics-tls\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.681202 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.684424 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.685723 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.686362 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-proxy-tls\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.687546 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.689190 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-srv-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.712063 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.713234 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hr4t\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-kube-api-access-8hr4t\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.713331 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.720405 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.731740 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8htvc\" (UniqueName: \"kubernetes.io/projected/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-kube-api-access-8htvc\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.771440 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779260 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779503 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkz8g\" (UniqueName: \"kubernetes.io/projected/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-kube-api-access-kkz8g\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 
00:08:04.779545 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-certs\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779570 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-srv-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779588 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-cert\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779607 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-registration-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779634 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-socket-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779659 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779677 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46st9\" (UniqueName: \"kubernetes.io/projected/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-kube-api-access-46st9\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779709 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8af772-70a9-4758-b597-363c1db463ad-config\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5vvm\" (UniqueName: \"kubernetes.io/projected/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-kube-api-access-n5vvm\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779748 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-csi-data-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779767 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779782 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-mountpoint-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779800 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-config-volume\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779829 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fz5j\" (UniqueName: \"kubernetes.io/projected/ae8af772-70a9-4758-b597-363c1db463ad-kube-api-access-7fz5j\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779863 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779882 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8af772-70a9-4758-b597-363c1db463ad-serving-cert\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779906 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg877\" (UniqueName: \"kubernetes.io/projected/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-kube-api-access-lg877\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779923 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-node-bootstrap-token\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779979 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcqqr\" (UniqueName: \"kubernetes.io/projected/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-kube-api-access-fcqqr\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779999 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-profile-collector-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.780019 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-metrics-tls\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.780036 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-plugins-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.780310 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-plugins-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.780382 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.280367261 +0000 UTC m=+142.759879788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.782685 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-mountpoint-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.783266 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-certs\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.783317 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-config-volume\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.784629 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-csi-data-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.785678 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-registration-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.788685 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8af772-70a9-4758-b597-363c1db463ad-config\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.789277 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9kxv\" (UniqueName: \"kubernetes.io/projected/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-kube-api-access-l9kxv\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.789401 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-socket-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.792086 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-cert\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.792567 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-srv-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.792921 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8af772-70a9-4758-b597-363c1db463ad-serving-cert\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.793584 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.794683 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.795956 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-node-bootstrap-token\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.798883 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-metrics-tls\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.801524 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.803515 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-profile-collector-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.805644 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.813643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kkpm\" (UniqueName: \"kubernetes.io/projected/57ed01e7-bfeb-428e-88c2-371662581ddf-kube-api-access-6kkpm\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.830553 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.840875 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.841063 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.841250 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"]
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.843373 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t5827"]
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.843529 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stqb9"]
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.855480 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"
Feb 17 00:08:04 crc kubenswrapper[4791]: W0217 00:08:04.864077 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod459f3992_b770_44d7_9ecc_0ae8a228134f.slice/crio-5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0 WatchSource:0}: Error finding container 5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0: Status 404 returned error can't find the container with id 5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.866005 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.877702 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.881701 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.883784 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.383761226 +0000 UTC m=+142.863273823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.918733 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkz8g\" (UniqueName: \"kubernetes.io/projected/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-kube-api-access-kkz8g\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.926013 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fz5j\" (UniqueName: \"kubernetes.io/projected/ae8af772-70a9-4758-b597-363c1db463ad-kube-api-access-7fz5j\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.935695 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.957549 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5vvm\" (UniqueName: \"kubernetes.io/projected/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-kube-api-access-n5vvm\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.959790 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"]
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.960088 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.972422 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"]
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.982219 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt865"]
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.983549 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.983696 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.483670338 +0000 UTC m=+142.963182865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.983883 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.984327 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.484315609 +0000 UTC m=+142.963828136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.987429 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46st9\" (UniqueName: \"kubernetes.io/projected/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-kube-api-access-46st9\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.003375 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcqqr\" (UniqueName: \"kubernetes.io/projected/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-kube-api-access-fcqqr\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.004031 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.008823 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.013230 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg877\" (UniqueName: \"kubernetes.io/projected/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-kube-api-access-lg877\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.015493 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.024577 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5bsn7"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.049421 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.055164 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4chtt"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.064556 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-946wq"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.090844 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.090945 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.091660 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.591631959 +0000 UTC m=+143.071144486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.105474 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerStarted","Data":"f303e766e2c893e279f8d6e69a4b7c3a7060f8cee57e4d09d9bd789c6c1a5750"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.112047 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t5827" event={"ID":"4360bf41-9e45-498e-8f94-2c43a0dc88e5","Type":"ContainerStarted","Data":"206b88e39f673e89ccffe9a9ef469f983b44a5b7b4d3a2eee4baa61d54a10cbe"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.116731 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.119777 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.128416 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerStarted","Data":"1bf210069f01dcf3433075dd8a895405951d971de359016b4eb9aa868416c26a"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.128536 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerStarted","Data":"142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.131019 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kt8q6" event={"ID":"459f3992-b770-44d7-9ecc-0ae8a228134f","Type":"ContainerStarted","Data":"5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.132217 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" event={"ID":"d15f89df-5eaf-48c8-963d-bc3e1c79bd43","Type":"ContainerStarted","Data":"9d6d2db44cc9f435332a1537d8dcee4fb6d883a2f1ee4560335c6594d7b3e53f"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.132261 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" event={"ID":"d15f89df-5eaf-48c8-963d-bc3e1c79bd43","Type":"ContainerStarted","Data":"fa86f84d9b3c927c534e66a8a8c7f06efa756a93210c273eee1c1f9021a4f2c6"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.133612 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerStarted","Data":"dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.134443 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" event={"ID":"5f00b345-a265-41cc-89b7-6f059fc4d5d1","Type":"ContainerStarted","Data":"34bc710b070e56fbe324763a2cc565269ad14a49a6e9ee5df8ba450137c73325"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.134465 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" event={"ID":"5f00b345-a265-41cc-89b7-6f059fc4d5d1","Type":"ContainerStarted","Data":"8cc8f299454a551b55d13491c775c0a939240c5e1e34939961d1c7c555a7aeb8"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.135334 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" event={"ID":"5526b957-e33f-4952-8dda-d2875c94686a","Type":"ContainerStarted","Data":"a815de6019a698c20bed21b73fbb7569a29188f85fc57d492c2463fb16dd7366"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.136132 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frmbv" event={"ID":"155619c1-12ba-4149-9dce-474e3735168c","Type":"ContainerStarted","Data":"d68e68a5c791b5025cc874dfe67df8e2c5a13f25ec30257d931f822404c8bd9c"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.136154 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frmbv" event={"ID":"155619c1-12ba-4149-9dce-474e3735168c","Type":"ContainerStarted","Data":"17b20743ee4b09534e8c553848de6d4017caec94554d3343f6ae0a7bdee43964"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.141953 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" event={"ID":"3d469ce1-e7ed-4826-a378-0de16f2b4e56","Type":"ContainerStarted","Data":"a64135342bdd7e818a09fa15f182506bb1c697bcadc4e9fc443eca02139d19bf"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.143779 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" event={"ID":"839b6744-bbe6-4b56-b020-181d86c604fe","Type":"ContainerStarted","Data":"27c89b131f2ada186ac01ff879401d8c7fcb538c8065adb433bed741f2f008f0"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.143803 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" event={"ID":"839b6744-bbe6-4b56-b020-181d86c604fe","Type":"ContainerStarted","Data":"482bbf8090489513d1c9878cc86b880c98f076c8e494308a40bd3a03e946699f"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.143813 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" event={"ID":"839b6744-bbe6-4b56-b020-181d86c604fe","Type":"ContainerStarted","Data":"c23842d5f2e3eef9553c2a7828b456f5c3333453e6613522a68d073ac3d56526"}
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.194516 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.195138 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.695091386 +0000 UTC m=+143.174603913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: W0217 00:08:05.221554 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643578b4_75ca_4765_8df5_9167688e3ced.slice/crio-7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790 WatchSource:0}: Error finding container 7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790: Status 404 returned error can't find the container with id 7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.254168 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.295059 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.297397 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.298569 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.798553282 +0000 UTC m=+143.278065809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.399311 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.400023 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.900007524 +0000 UTC m=+143.379520051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.500860 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.501465 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.001444995 +0000 UTC m=+143.480957522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.592384 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.597066 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.602247 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.602821 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.102808244 +0000 UTC m=+143.582320761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.608820 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.616245 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.626018 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.627726 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jsj6"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.698182 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"]
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.702990 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.703253 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.203227393 +0000 UTC m=+143.682739920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.703396 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.703925 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.203907195 +0000 UTC m=+143.683419722 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.705091 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"] Feb 17 00:08:05 crc kubenswrapper[4791]: W0217 00:08:05.713663 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b747aa6_3874_4f71_86bb_d340398d7bc4.slice/crio-bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff WatchSource:0}: Error finding container bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff: Status 404 returned error can't find the container with id bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.715249 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"] Feb 17 00:08:05 crc kubenswrapper[4791]: W0217 00:08:05.799383 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71967495_8841_4810_89e5_e114b9887c5e.slice/crio-87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60 WatchSource:0}: Error finding container 87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60: Status 404 returned error can't find the container with id 87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60 Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.804548 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.805294 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.305277155 +0000 UTC m=+143.784789682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.866769 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" podStartSLOduration=123.866751731 podStartE2EDuration="2m3.866751731s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:05.832686876 +0000 UTC m=+143.312199413" watchObservedRunningTime="2026-02-17 00:08:05.866751731 +0000 UTC m=+143.346264258" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.906166 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.906599 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.406588112 +0000 UTC m=+143.886100639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.006886 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.006984 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.50696922 +0000 UTC m=+143.986481747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.007239 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.008030 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.508020853 +0000 UTC m=+143.987533370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.018605 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.024091 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.035301 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.088853 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-72t6m"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.112240 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.112530 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:06.612517813 +0000 UTC m=+144.092030340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.121719 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"] Feb 17 00:08:06 crc kubenswrapper[4791]: W0217 00:08:06.127890 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb516ca_988f_4b77_aea0_10cd22ce2b77.slice/crio-c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c WatchSource:0}: Error finding container c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c: Status 404 returned error can't find the container with id c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.136018 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29521440-k6f7k" podStartSLOduration=124.135995498 podStartE2EDuration="2m4.135995498s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.124599902 +0000 UTC m=+143.604112429" watchObservedRunningTime="2026-02-17 00:08:06.135995498 +0000 UTC m=+143.615508015" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.139027 4791 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.190520 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-946wq"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.194097 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.196444 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4chtt"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.202591 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.213369 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.213755 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.713739398 +0000 UTC m=+144.193251925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.240475 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tlpgd"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.241409 4791 generic.go:334] "Generic (PLEG): container finished" podID="1b1913d4-85d3-4596-acea-6e272cf81e8e" containerID="40fb24938620be8b1d416e78d154ed948754865f026cb8ac47e2f7cdff5ab937" exitCode=0 Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.241471 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerDied","Data":"40fb24938620be8b1d416e78d154ed948754865f026cb8ac47e2f7cdff5ab937"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.252769 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" event={"ID":"5e2cf0a7-86f6-4858-98ad-08c4c644deb9","Type":"ContainerStarted","Data":"95328a6e95fda24b2e97b8fc71783b3f7f2bbed00ffdfd64eb7ca9514a883552"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.253150 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" event={"ID":"5e2cf0a7-86f6-4858-98ad-08c4c644deb9","Type":"ContainerStarted","Data":"4a8e89134aa3ad074635175763043d317ffc6ed9e2fc49c3f8ca922783b89cf5"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.258743 4791 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" event={"ID":"9578978b-522d-48d8-9b08-384752fc49a1","Type":"ContainerStarted","Data":"7f331e409b51489b981cf18efb64a20186608a5eff470f354f95c51f4220dbe4"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.261899 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" event={"ID":"0b747aa6-3874-4f71-86bb-d340398d7bc4","Type":"ContainerStarted","Data":"bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.274699 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" event={"ID":"fe44c059-87ef-4805-b78f-b8c3cdfd844e","Type":"ContainerStarted","Data":"5834b63100c93cd9f5d7ab15fdceb720d8fe47ee8d67789995b9fafee6f966ba"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.278493 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt865" event={"ID":"0522c983-dae6-41ca-807a-ff45912a0024","Type":"ContainerStarted","Data":"75e7a03529d69627bae6599e10cd571ed8062d4194733ab6148d56f13ab20292"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.278525 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt865" event={"ID":"0522c983-dae6-41ca-807a-ff45912a0024","Type":"ContainerStarted","Data":"a66362d342c250fa63e0df4225566f96b7e347652c5e248e46f436271cd8a461"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.278765 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.279968 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.280045 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt865" podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.286794 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerStarted","Data":"d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.286837 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerStarted","Data":"0c85ff67a65174e4212f77cdeae113e56a44995c48f0f9d56c7ed9adda3bd480"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.287793 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.289520 4791 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ht455 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.289565 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" 
podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.292966 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.293526 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" event={"ID":"5f00b345-a265-41cc-89b7-6f059fc4d5d1","Type":"ContainerStarted","Data":"b7a8b18536fa6767c60f309e08d47207638d465f657bf2bdbc7d3f33f8420cdd"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.295952 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" event={"ID":"9c752f56-7754-4718-aea5-cb41d6ac4253","Type":"ContainerStarted","Data":"ce9274f47cae1c20fec77be0219875543af87f8c562bbea7299d31cef267add7"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.295992 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" event={"ID":"9c752f56-7754-4718-aea5-cb41d6ac4253","Type":"ContainerStarted","Data":"959c67ba56cd2f54f4dff7177cf5df6ac41f5fff7c2ee1fbec83484cb2387fd7"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.297179 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" event={"ID":"e3032312-913c-4072-ac18-56fdc689cbac","Type":"ContainerStarted","Data":"6b5d87aecd7ce403be0e26e7def9943ae61b90ac529666b24e43dee5fa092938"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.297989 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" event={"ID":"8543e6a7-7bb0-4a35-96c5-bcae0763cc78","Type":"ContainerStarted","Data":"2477c6f12128d4912d9d839f8fff599ccaef2a4115288be43d6389a6b9f92bb1"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.299318 4791 generic.go:334] "Generic (PLEG): container finished" podID="643578b4-75ca-4765-8df5-9167688e3ced" containerID="caa80972cddb40983cf80f6f3371bdce39531bf5d3a8e10cac13ecf851412e29" exitCode=0 Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.299363 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" event={"ID":"643578b4-75ca-4765-8df5-9167688e3ced","Type":"ContainerDied","Data":"caa80972cddb40983cf80f6f3371bdce39531bf5d3a8e10cac13ecf851412e29"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.300305 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" event={"ID":"643578b4-75ca-4765-8df5-9167688e3ced","Type":"ContainerStarted","Data":"7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.301221 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" event={"ID":"03d7a8df-a8a3-4b34-bd28-d554ae70875a","Type":"ContainerStarted","Data":"5b641221af4812862b561709d03b916acf859293fea57d3885afe67acc85e731"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.303270 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t5827" event={"ID":"4360bf41-9e45-498e-8f94-2c43a0dc88e5","Type":"ContainerStarted","Data":"b64b8ec32795b5b45afa67cf013bcbd40ce1d7f9297fbba36a4eaaef9064bd6d"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.304224 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.306309 4791 patch_prober.go:28] interesting pod/console-operator-58897d9998-t5827 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.306347 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t5827" podUID="4360bf41-9e45-498e-8f94-2c43a0dc88e5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.314844 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.315276 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.815254732 +0000 UTC m=+144.294767259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.316491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" event={"ID":"3d469ce1-e7ed-4826-a378-0de16f2b4e56","Type":"ContainerStarted","Data":"98ec5fd16ca8e4ce986234c99077ede912bf552eaab0db59db3d4daa61506b98"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.321750 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" event={"ID":"b7f0fa93-740f-43aa-9350-24d9920a9345","Type":"ContainerStarted","Data":"17ee9158e3fdfd68777a3f05d407f424eb25dc139182fc00aacd2bde56bdfe08"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.321784 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" event={"ID":"b7f0fa93-740f-43aa-9350-24d9920a9345","Type":"ContainerStarted","Data":"ebd3e1d0b20d4e9d74101c27aed6853f5a3c379ff19f48e52abf9f9e0ca45145"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.324532 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" podStartSLOduration=124.324518499 podStartE2EDuration="2m4.324518499s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.309415934 +0000 UTC 
m=+143.788928461" watchObservedRunningTime="2026-02-17 00:08:06.324518499 +0000 UTC m=+143.804031026"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.325925 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"]
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.326234 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" event={"ID":"c5fb65f7-7cc6-4834-853e-a91eebc956fd","Type":"ContainerStarted","Data":"4db7223c621dad4ca4a2cf4fdc79db21ea445ff73894457d60af10a67d532fa1"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.334216 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerStarted","Data":"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.334263 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerStarted","Data":"bc92c2848641b389e30636fc84dbaa434604ffad03bf817e464c4e944f11c4ea"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.334386 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.336483 4791 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r8zpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body=
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.336515 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.337512 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" event={"ID":"5526b957-e33f-4952-8dda-d2875c94686a","Type":"ContainerStarted","Data":"db531583ab8b781dd61dfe3d2878986ca0106d031f65f34ed6d98ea59985f958"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.343091 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5bsn7" event={"ID":"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd","Type":"ContainerStarted","Data":"4f48b0b1fc00d23e7619acb357b92513cac2e30b3d59e85597d23f892fcaa983"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.343183 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5bsn7" event={"ID":"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd","Type":"ContainerStarted","Data":"1658775be87da06af96b858aee3c59d0f774214e3ca4092a0f764a8ff07f1081"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.347965 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerStarted","Data":"5bbd0db20373a7441f5bb6d421e3277a5d8b5d7d851ff90580117407daeb995a"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.348003 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerStarted","Data":"426d716b18b9a71736268368767e3928361e1d1bb985b12b162b76f2351d832b"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.353982 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kt8q6" event={"ID":"459f3992-b770-44d7-9ecc-0ae8a228134f","Type":"ContainerStarted","Data":"b7a95c7efddf0dd9f9591d69cbb3badc012e04c94fd97824a044926e6b6f6c9a"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.362194 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerStarted","Data":"eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.362888 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:06 crc kubenswrapper[4791]: W0217 00:08:06.364000 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8af772_70a9_4758_b597_363c1db463ad.slice/crio-1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92 WatchSource:0}: Error finding container 1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92: Status 404 returned error can't find the container with id 1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.368163 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" event={"ID":"b6c19ecc-0208-46de-8c03-6780bba30353","Type":"ContainerStarted","Data":"69b342b3e6d027d571b603d15930a15b414ae48ff61d21acef0575eabb9dfc2d"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.369856 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" event={"ID":"afb516ca-988f-4b77-aea0-10cd22ce2b77","Type":"ContainerStarted","Data":"c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.370738 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" event={"ID":"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4","Type":"ContainerStarted","Data":"5fbb286dd9c23a26eccebb86c4ca61f9def0db6f356bb9410711a9d2ce0c4e86"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.372029 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" event={"ID":"71967495-8841-4810-89e5-e114b9887c5e","Type":"ContainerStarted","Data":"87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60"}
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.375300 4791 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jftdn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.375372 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.416958 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.421369 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.921354173 +0000 UTC m=+144.400866700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.518182 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.518458 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.018422034 +0000 UTC m=+144.497934571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.518795 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.519541 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.01953021 +0000 UTC m=+144.499042737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.620694 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.620998 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.120985561 +0000 UTC m=+144.600498088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.678585 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.704962 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.705010 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.710745 4791 csr.go:261] certificate signing request csr-r7wvn is approved, waiting to be issued
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.718738 4791 csr.go:257] certificate signing request csr-r7wvn is issued
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.721704 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-frmbv" podStartSLOduration=124.721683349 podStartE2EDuration="2m4.721683349s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.71796447 +0000 UTC m=+144.197476997" watchObservedRunningTime="2026-02-17 00:08:06.721683349 +0000 UTC m=+144.201195876"
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.722295 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.722617 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.222607199 +0000 UTC m=+144.702119726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.823279 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.826251 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.32623169 +0000 UTC m=+144.805744217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.927263 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.927581 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.427569729 +0000 UTC m=+144.907082256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:06.999227 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" podStartSLOduration=124.999207971 podStartE2EDuration="2m4.999207971s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.957604184 +0000 UTC m=+144.437116711" watchObservedRunningTime="2026-02-17 00:08:06.999207971 +0000 UTC m=+144.478720498"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.028899 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.029401 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.529382361 +0000 UTC m=+145.008894898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.032722 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" podStartSLOduration=124.032703718 podStartE2EDuration="2m4.032703718s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.999722758 +0000 UTC m=+144.479235285" watchObservedRunningTime="2026-02-17 00:08:07.032703718 +0000 UTC m=+144.512216255"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.082408 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" podStartSLOduration=125.082374016 podStartE2EDuration="2m5.082374016s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.037039078 +0000 UTC m=+144.516551605" watchObservedRunningTime="2026-02-17 00:08:07.082374016 +0000 UTC m=+144.561886543"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.121023 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" podStartSLOduration=125.120995047 podStartE2EDuration="2m5.120995047s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.120380107 +0000 UTC m=+144.599892644" watchObservedRunningTime="2026-02-17 00:08:07.120995047 +0000 UTC m=+144.600507574"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.122614 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" podStartSLOduration=125.122605858 podStartE2EDuration="2m5.122605858s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.084334189 +0000 UTC m=+144.563846716" watchObservedRunningTime="2026-02-17 00:08:07.122605858 +0000 UTC m=+144.602118385"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.133533 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.134222 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.634207711 +0000 UTC m=+145.113720238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.191823 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" podStartSLOduration=125.191799243 podStartE2EDuration="2m5.191799243s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.19136697 +0000 UTC m=+144.670879497" watchObservedRunningTime="2026-02-17 00:08:07.191799243 +0000 UTC m=+144.671311770"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.197919 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-t5827" podStartSLOduration=125.197900449 podStartE2EDuration="2m5.197900449s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.164458734 +0000 UTC m=+144.643971271" watchObservedRunningTime="2026-02-17 00:08:07.197900449 +0000 UTC m=+144.677412976"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.239661 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.240615 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.739800876 +0000 UTC m=+145.219313403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.240906 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.241194 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.741186721 +0000 UTC m=+145.220699248 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.281029 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" podStartSLOduration=125.281010352 podStartE2EDuration="2m5.281010352s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.280433133 +0000 UTC m=+144.759945660" watchObservedRunningTime="2026-02-17 00:08:07.281010352 +0000 UTC m=+144.760522879"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.341552 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.342089 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.842075895 +0000 UTC m=+145.321588422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.367496 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kt8q6" podStartSLOduration=124.367475351 podStartE2EDuration="2m4.367475351s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.320711117 +0000 UTC m=+144.800223644" watchObservedRunningTime="2026-02-17 00:08:07.367475351 +0000 UTC m=+144.846987878"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.390807 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" podStartSLOduration=124.39078679 podStartE2EDuration="2m4.39078679s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.388997223 +0000 UTC m=+144.868509750" watchObservedRunningTime="2026-02-17 00:08:07.39078679 +0000 UTC m=+144.870299317"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.442545 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" event={"ID":"57ed01e7-bfeb-428e-88c2-371662581ddf","Type":"ContainerStarted","Data":"47db9cbaac3ec07815f8620ce83195c7b16b1455be8613b9e97946b41fce37d2"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.442602 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" event={"ID":"57ed01e7-bfeb-428e-88c2-371662581ddf","Type":"ContainerStarted","Data":"2c1890499027e2938c74ddde9e2725343a834c43969364e8ddfd9209318e0174"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.445182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.447410 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.947392781 +0000 UTC m=+145.426905318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.448561 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerStarted","Data":"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.448598 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerStarted","Data":"f1a3439d45cbb877a9cdb806affb8d5e0982a3ff436258b9fc60b97b89a3ef01"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.449253 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.455661 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bfffb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.455718 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.457566 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" event={"ID":"c5fb65f7-7cc6-4834-853e-a91eebc956fd","Type":"ContainerStarted","Data":"81b427b7573e4cf9cfbb7bccb1f44c300b9035efb6402caed9a3e5c28a75be2c"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.457692 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" event={"ID":"c5fb65f7-7cc6-4834-853e-a91eebc956fd","Type":"ContainerStarted","Data":"b5070e857243a46a8b6ec53eb4bd092d26e78b3683e50f8c6f17a30b85de8f06"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.464677 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rt865" podStartSLOduration=125.464660796 podStartE2EDuration="2m5.464660796s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.436080867 +0000 UTC m=+144.915593394" watchObservedRunningTime="2026-02-17 00:08:07.464660796 +0000 UTC m=+144.944173323"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.465370 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5bsn7" podStartSLOduration=6.465362809 podStartE2EDuration="6.465362809s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.462973022 +0000 UTC m=+144.942485549" watchObservedRunningTime="2026-02-17 00:08:07.465362809 +0000 UTC m=+144.944875336"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.468966 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" event={"ID":"afb516ca-988f-4b77-aea0-10cd22ce2b77","Type":"ContainerStarted","Data":"4cb4c39123a8be97c43cf0e8daee53a169b6ed27a15a587373572b7e66c5f434"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.478246 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" event={"ID":"b6c19ecc-0208-46de-8c03-6780bba30353","Type":"ContainerStarted","Data":"88b8363acadb5b09be62aa6d3adc4a7a5a77f1c9201ddd017a5c9eafc8b56850"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.478786 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.480919 4791 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bt9mf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body=
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.480955 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" podUID="b6c19ecc-0208-46de-8c03-6780bba30353" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.481829 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" event={"ID":"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9","Type":"ContainerStarted","Data":"adccb50d3753547360cab0c817bc0d7632a6d724c3247fc54920b604edd01bb6"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.481859 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" event={"ID":"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9","Type":"ContainerStarted","Data":"91b7f18e5cbfceaf615e5d9ab87af73003d95c6a41afd231c4fdf2400c98cb75"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.486971 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" event={"ID":"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4","Type":"ContainerStarted","Data":"8947c56ad1abf6d6fce581ea563cbdab0f0532d6619fdfa98830e8a9e3f4a72c"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.493365 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" event={"ID":"71967495-8841-4810-89e5-e114b9887c5e","Type":"ContainerStarted","Data":"8e11fc249b980b74afafbdb667e9dd6da9a26fff5a79d4df56b4c2561adf95c2"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.493419 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" event={"ID":"71967495-8841-4810-89e5-e114b9887c5e","Type":"ContainerStarted","Data":"4a592c72a5c70f3b607d8355518fa2a61e41d24c5046c8917097386958e68595"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.494209 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.497552 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" event={"ID":"0b747aa6-3874-4f71-86bb-d340398d7bc4","Type":"ContainerStarted","Data":"2c221fab134d5bb1527c120a90943844b5e7c53ff11f82640c354dffd9e48720"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.500543 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" event={"ID":"9c752f56-7754-4718-aea5-cb41d6ac4253","Type":"ContainerStarted","Data":"fd12d2709dab0233fec62f39ff499f32b9c663664b15a82880186278c4b6e39c"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.502995 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" event={"ID":"7bf3a6ce-9a2b-49cc-9360-f552988f2b38","Type":"ContainerStarted","Data":"7dc5812277568e4be7c6d93c6c7690e193a15bbfb1d865aabdbab9ae66fc01f9"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.503022 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" event={"ID":"7bf3a6ce-9a2b-49cc-9360-f552988f2b38","Type":"ContainerStarted","Data":"2154b5e50cacaf67ad92bfbf591488e9a51c3d8f281cddf4f2860670bbd04712"}
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.504319 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.505704 4791 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dhmqq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.505757 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" podUID="7bf3a6ce-9a2b-49cc-9360-f552988f2b38" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.521472 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" podStartSLOduration=125.521458552 podStartE2EDuration="2m5.521458552s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.520967076 +0000 UTC m=+145.000479603" watchObservedRunningTime="2026-02-17 00:08:07.521458552 +0000 UTC m=+145.000971079" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.550047 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.551480 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.051449816 +0000 UTC m=+145.530962423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.562642 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerStarted","Data":"891df08438b436b0b07d8427461e11fc0c9eb7e638150952cf235c2137b716b6"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.563006 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerStarted","Data":"a39ba9373a1c8330d92d717ecb292520ee487332b22cd14b0e5fe57ebfb54ebe"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.564439 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podStartSLOduration=124.564417943 podStartE2EDuration="2m4.564417943s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.563897827 +0000 UTC m=+145.043410344" watchObservedRunningTime="2026-02-17 00:08:07.564417943 +0000 UTC m=+145.043930470" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.580406 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" 
event={"ID":"9578978b-522d-48d8-9b08-384752fc49a1","Type":"ContainerStarted","Data":"73dce2445581148c9aaac0ae8e3c3141a42b2fd397bc6713d36dce2b9e9f8734"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.594209 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4chtt" event={"ID":"eeffaf81-97bf-4570-b2f4-4692c4bda9ac","Type":"ContainerStarted","Data":"465e4b0a5eabb9bcd00a642ab6edbddbf334af705cc21d7f8cc83097c19b70c9"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.597004 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" podStartSLOduration=124.59698501 podStartE2EDuration="2m4.59698501s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.593667484 +0000 UTC m=+145.073180011" watchObservedRunningTime="2026-02-17 00:08:07.59698501 +0000 UTC m=+145.076497537" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.617019 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" event={"ID":"8543e6a7-7bb0-4a35-96c5-bcae0763cc78","Type":"ContainerStarted","Data":"397616aa3db0a71bad5937164d93b4b861dc501caa61b23b59368d98023799d5"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.623695 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerStarted","Data":"d4e158158a9ecd782e740e08c84c8fed9941911050983e345a832d4aaf89547b"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.645675 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" podStartSLOduration=124.645654255 
podStartE2EDuration="2m4.645654255s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.642505983 +0000 UTC m=+145.122018520" watchObservedRunningTime="2026-02-17 00:08:07.645654255 +0000 UTC m=+145.125166782" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.652059 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.657823 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" event={"ID":"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23","Type":"ContainerStarted","Data":"b94bcc5ae36e521ca026383a0c5e72a801fc0341a5b3ffe868661eb976b4bc13"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.657868 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" event={"ID":"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23","Type":"ContainerStarted","Data":"19d706732e0f8e2555cd52821bb9ca106e51d445f5f15e844321a815eb6e1524"} Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.658542 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.158522529 +0000 UTC m=+145.638035056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.690288 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:07 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:07 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:07 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.690340 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.711437 4791 generic.go:334] "Generic (PLEG): container finished" podID="ad62ba3d-c60a-4e1f-9768-187e74151f24" containerID="5bbd0db20373a7441f5bb6d421e3277a5d8b5d7d851ff90580117407daeb995a" exitCode=0 Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.711527 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerDied","Data":"5bbd0db20373a7441f5bb6d421e3277a5d8b5d7d851ff90580117407daeb995a"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.712108 4791 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.714190 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" podStartSLOduration=124.714178368 podStartE2EDuration="2m4.714178368s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.684937708 +0000 UTC m=+145.164450255" watchObservedRunningTime="2026-02-17 00:08:07.714178368 +0000 UTC m=+145.193690895" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.726061 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 00:03:06 +0000 UTC, rotation deadline is 2026-12-16 14:11:36.826202543 +0000 UTC Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.726100 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7262h3m29.100104771s for next certificate rotation Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.757011 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.768559 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.268528355 +0000 UTC m=+145.748040882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.782315 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" podStartSLOduration=124.782296259 podStartE2EDuration="2m4.782296259s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.781688229 +0000 UTC m=+145.261200756" watchObservedRunningTime="2026-02-17 00:08:07.782296259 +0000 UTC m=+145.261808786" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.784197 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" podStartSLOduration=124.784186209 podStartE2EDuration="2m4.784186209s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.713189846 +0000 UTC m=+145.192702373" watchObservedRunningTime="2026-02-17 00:08:07.784186209 +0000 UTC m=+145.263698736" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.804061 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" event={"ID":"643578b4-75ca-4765-8df5-9167688e3ced","Type":"ContainerStarted","Data":"62f51fb9424beb412ed9ef02bcaf7076ed9292d3aa6ee47b9a982216b2afd220"} Feb 17 00:08:07 crc 
kubenswrapper[4791]: I0217 00:08:07.841273 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" podStartSLOduration=124.841255384 podStartE2EDuration="2m4.841255384s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.839058123 +0000 UTC m=+145.318570660" watchObservedRunningTime="2026-02-17 00:08:07.841255384 +0000 UTC m=+145.320767911" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.842707 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-946wq" event={"ID":"1ce1b285-b6aa-4361-aa9e-5274a9863b6a","Type":"ContainerStarted","Data":"907f113a9eb0cb9413583d679214fec1c783c5575065ad366e0da999ba69a20a"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.842861 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-946wq" event={"ID":"1ce1b285-b6aa-4361-aa9e-5274a9863b6a","Type":"ContainerStarted","Data":"4186455468f535264d84e387847aaf3597f1545f4f7b77b7e46747846bfb16be"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.860051 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.860432 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:08.360421371 +0000 UTC m=+145.839933898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.872160 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" event={"ID":"fe44c059-87ef-4805-b78f-b8c3cdfd844e","Type":"ContainerStarted","Data":"d862b93abaa637b5ab55ebf1f661d266f0d2f4eb72a89f6d98beacebc81b9836"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.885343 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" event={"ID":"e3032312-913c-4072-ac18-56fdc689cbac","Type":"ContainerStarted","Data":"75ef4aa7afdc141b6e36cf7b02be25daeea88d141e2cdf4b633c01c8c4a99163"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.896017 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"8e87bff544ec51acf1da27e383e4d3a8706325f1b9fe8906f2ac798c476888bc"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.904188 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" podStartSLOduration=124.904173007 podStartE2EDuration="2m4.904173007s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.90085212 +0000 UTC m=+145.380364647" watchObservedRunningTime="2026-02-17 00:08:07.904173007 +0000 UTC m=+145.383685534" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.905794 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" podStartSLOduration=125.905784988 podStartE2EDuration="2m5.905784988s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.883312596 +0000 UTC m=+145.362825123" watchObservedRunningTime="2026-02-17 00:08:07.905784988 +0000 UTC m=+145.385297515" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.910642 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" event={"ID":"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101","Type":"ContainerStarted","Data":"7417c148613956dc9e2a33a3c1c857f51db890058385f1ed9572da09e498bcfe"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.910745 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" event={"ID":"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101","Type":"ContainerStarted","Data":"cc582c53139208e9ff16dceeec5c6852aa1e467a790f13ee77acbbaad464c1e0"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.911446 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.916197 4791 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hb9dg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.916252 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" podUID="1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.935263 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" event={"ID":"03d7a8df-a8a3-4b34-bd28-d554ae70875a","Type":"ContainerStarted","Data":"87d0f65c46e40821190dfe62fb590339519d2031f9ca103c8cef0bd4136479e1"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.960741 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.962394 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.462379318 +0000 UTC m=+145.941891835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.964355 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-946wq" podStartSLOduration=6.964339011 podStartE2EDuration="6.964339011s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.927342702 +0000 UTC m=+145.406855229" watchObservedRunningTime="2026-02-17 00:08:07.964339011 +0000 UTC m=+145.443851538" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.965917 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" podStartSLOduration=124.965910252 podStartE2EDuration="2m4.965910252s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.962913096 +0000 UTC m=+145.442425613" watchObservedRunningTime="2026-02-17 00:08:07.965910252 +0000 UTC m=+145.445422779" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.976282 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" event={"ID":"ae8af772-70a9-4758-b597-363c1db463ad","Type":"ContainerStarted","Data":"b5e8825abe9a3565f618a3e4c2364368a7b0c7552d26b7768a158f2c23435e88"} Feb 17 00:08:07 crc 
kubenswrapper[4791]: I0217 00:08:07.976317 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" event={"ID":"ae8af772-70a9-4758-b597-363c1db463ad","Type":"ContainerStarted","Data":"1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.978859 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.978920 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt865" podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.979366 4791 patch_prober.go:28] interesting pod/console-operator-58897d9998-t5827 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.979402 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t5827" podUID="4360bf41-9e45-498e-8f94-2c43a0dc88e5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.992419 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:08 crc 
kubenswrapper[4791]: I0217 00:08:08.033655 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" podStartSLOduration=126.03363591 podStartE2EDuration="2m6.03363591s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.001165195 +0000 UTC m=+145.480677742" watchObservedRunningTime="2026-02-17 00:08:08.03363591 +0000 UTC m=+145.513148437"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.035328 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" podStartSLOduration=125.035323324 podStartE2EDuration="2m5.035323324s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.031837651 +0000 UTC m=+145.511350178" watchObservedRunningTime="2026-02-17 00:08:08.035323324 +0000 UTC m=+145.514835851"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.064376 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.075511 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.575492565 +0000 UTC m=+146.055005192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.087909 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" podStartSLOduration=125.087890773 podStartE2EDuration="2m5.087890773s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.087515591 +0000 UTC m=+145.567028118" watchObservedRunningTime="2026-02-17 00:08:08.087890773 +0000 UTC m=+145.567403290"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.118501 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.139805 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" podStartSLOduration=125.139791593 podStartE2EDuration="2m5.139791593s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.137262242 +0000 UTC m=+145.616774769" watchObservedRunningTime="2026-02-17 00:08:08.139791593 +0000 UTC m=+145.619304120"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.170553 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.171039 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.671025057 +0000 UTC m=+146.150537584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.221769 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" podStartSLOduration=125.221749498 podStartE2EDuration="2m5.221749498s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.201065323 +0000 UTC m=+145.680577850" watchObservedRunningTime="2026-02-17 00:08:08.221749498 +0000 UTC m=+145.701262025"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.222945 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" podStartSLOduration=125.222939226 podStartE2EDuration="2m5.222939226s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.160277521 +0000 UTC m=+145.639790048" watchObservedRunningTime="2026-02-17 00:08:08.222939226 +0000 UTC m=+145.702451753"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.272793 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.273096 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.773083898 +0000 UTC m=+146.252596415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.348761 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" podStartSLOduration=125.348740981 podStartE2EDuration="2m5.348740981s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.347354727 +0000 UTC m=+145.826867244" watchObservedRunningTime="2026-02-17 00:08:08.348740981 +0000 UTC m=+145.828253508"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.375624 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.375776 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.87575608 +0000 UTC m=+146.355268607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.375990 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.376158 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" podStartSLOduration=125.376137171 podStartE2EDuration="2m5.376137171s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.373099663 +0000 UTC m=+145.852612190" watchObservedRunningTime="2026-02-17 00:08:08.376137171 +0000 UTC m=+145.855649698"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.376310 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.876301876 +0000 UTC m=+146.355814403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.478577 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.478953 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.978937067 +0000 UTC m=+146.458449594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.579787 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.580110 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.08008794 +0000 UTC m=+146.559600457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.677663 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 00:08:08 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld
Feb 17 00:08:08 crc kubenswrapper[4791]: [+]process-running ok
Feb 17 00:08:08 crc kubenswrapper[4791]: healthz check failed
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.677720 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.681080 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.681216 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.18119715 +0000 UTC m=+146.660709677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.681410 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.681752 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.181741827 +0000 UTC m=+146.661254354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.757133 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.782226 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.782604 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.28259028 +0000 UTC m=+146.762102807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.884328 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.887491 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.387477553 +0000 UTC m=+146.866990080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.989032 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.989846 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.489569225 +0000 UTC m=+146.969081752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.990245 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.990585 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.490574107 +0000 UTC m=+146.970086634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.004543 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerStarted","Data":"beb21327d747d81874b1f2bc384ac019160f075efa5e3693137edea5df10aad0"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.013712 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" event={"ID":"afb516ca-988f-4b77-aea0-10cd22ce2b77","Type":"ContainerStarted","Data":"95225305956cc307b1fd95289e86fe96f4a3d50e669118b9c57071737580a1fe"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.017966 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" event={"ID":"fe44c059-87ef-4805-b78f-b8c3cdfd844e","Type":"ContainerStarted","Data":"ad14918ea54f5f65c676b3347afb288620684183969cb994bb4b1b542057f1a3"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.021127 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"b062dbdf343b3684687d313c0a8e3c0a2c50ee9846d661195465b4147adac306"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.023848 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerStarted","Data":"6581a449ea38bab0e4535ff76ee15c3ea69bca38a74bdbd7c31b9de1ff9d756e"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.025594 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" event={"ID":"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9","Type":"ContainerStarted","Data":"c1fc365db21355adc10ca4434fc6bfa3c4562b2be333b5195506d687a0ab5b98"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.029220 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" event={"ID":"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4","Type":"ContainerStarted","Data":"ce2dff7d47e9dd38a5436e594237da01da95c1f17a299dc45787ca15d700b4d2"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.031656 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" event={"ID":"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23","Type":"ContainerStarted","Data":"64685b3eaf469ada0cd4ed21bf6b0dd74fa2905ced4ac43eccb0f2d80dce5b32"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.034956 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4chtt" event={"ID":"eeffaf81-97bf-4570-b2f4-4692c4bda9ac","Type":"ContainerStarted","Data":"c70e43d973b6126077673278b53e1b32835f0991ddb2d79c1c137fe06938c45b"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.035082 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4chtt" event={"ID":"eeffaf81-97bf-4570-b2f4-4692c4bda9ac","Type":"ContainerStarted","Data":"da29cdcbc1a24aabb2bd2c21956f85a3dc79ca23b54b9abc3e41aeff45f5b102"}
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.037499 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bfffb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.037538 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.056946 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.059530 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.059684 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.060261 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.061090 4791 patch_prober.go:28] interesting pod/apiserver-76f77b778f-flvjk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.061157 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" podUID="1b1913d4-85d3-4596-acea-6e272cf81e8e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.091651 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.091826 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.591799122 +0000 UTC m=+147.071311649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.093618 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.095298 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.595286464 +0000 UTC m=+147.074798991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.099540 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" podStartSLOduration=127.09952487 podStartE2EDuration="2m7.09952487s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.098643851 +0000 UTC m=+146.578156378" watchObservedRunningTime="2026-02-17 00:08:09.09952487 +0000 UTC m=+146.579037397"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.195136 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.196628 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.696613351 +0000 UTC m=+147.176125878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.299853 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.300354 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.800341907 +0000 UTC m=+147.279854434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.351316 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" podStartSLOduration=127.351295765 podStartE2EDuration="2m7.351295765s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.350309753 +0000 UTC m=+146.829822280" watchObservedRunningTime="2026-02-17 00:08:09.351295765 +0000 UTC m=+146.830808292"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.351756 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" podStartSLOduration=127.35175279 podStartE2EDuration="2m7.35175279s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.307756136 +0000 UTC m=+146.787268663" watchObservedRunningTime="2026-02-17 00:08:09.35175279 +0000 UTC m=+146.831265317"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.399212 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.399531 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.401745 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.402070 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.902053957 +0000 UTC m=+147.381566484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.451565 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4chtt" podStartSLOduration=8.451548918 podStartE2EDuration="8.451548918s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.450668521 +0000 UTC m=+146.930181048" watchObservedRunningTime="2026-02-17 00:08:09.451548918 +0000 UTC m=+146.931061445"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.486228 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" podStartSLOduration=126.486205773 podStartE2EDuration="2m6.486205773s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.485424387 +0000 UTC m=+146.964936914" watchObservedRunningTime="2026-02-17 00:08:09.486205773 +0000 UTC m=+146.965718310"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.492156 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.504722 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.505078 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.005062169 +0000 UTC m=+147.484574696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.605640 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.606192 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.106155879 +0000 UTC m=+147.585668406 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.684759 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:09 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:09 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:09 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.685091 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.713455 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.714742 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:10.21472014 +0000 UTC m=+147.694232667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.815584 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.816020 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.316006037 +0000 UTC m=+147.795518564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.865153 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.916738 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.917090 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.417074346 +0000 UTC m=+147.896586873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.025558 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.025774 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.52574606 +0000 UTC m=+148.005258587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.025860 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.026402 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.526391021 +0000 UTC m=+148.005903548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.035709 4791 patch_prober.go:28] interesting pod/console-operator-58897d9998-t5827 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.036000 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t5827" podUID="4360bf41-9e45-498e-8f94-2c43a0dc88e5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.043749 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"dba772c5c9ab5f5299b58fe79526e98940d7ac35727f37b1ab8d9d9a2e04639f"} Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.043791 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"9afeea82085ae371d14fbfa4430b59a5455b1127d46ebbf26203c0321439b678"} Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 
00:08:10.044414 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.048914 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bfffb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.048988 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.053241 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.126882 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.127067 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.627034927 +0000 UTC m=+148.106547454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.127862 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.129782 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.629764525 +0000 UTC m=+148.109277122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.229261 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.229476 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.729447709 +0000 UTC m=+148.208960236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.229526 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.229884 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.729869743 +0000 UTC m=+148.209382270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.330974 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.331180 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.83115408 +0000 UTC m=+148.310666607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.331497 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.331829 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.831821551 +0000 UTC m=+148.311334069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.432699 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.432905 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.932876831 +0000 UTC m=+148.412389358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.433127 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.433462 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.933449549 +0000 UTC m=+148.412962076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.446059 4791 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.534651 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.534834 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.034802217 +0000 UTC m=+148.514314744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.535096 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.535425 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.035408987 +0000 UTC m=+148.514921514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.538201 4791 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T00:08:10.446095936Z","Handler":null,"Name":""} Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.544540 4791 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.544568 4791 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.636443 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.640902 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.670576 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.682377 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:10 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:10 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:10 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.682437 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.738120 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.755343 4791 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.755380 4791 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.871820 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"]
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.872740 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.876364 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.915406 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"]
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.918363 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.027928 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8xbcp"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.029100 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.030528 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.040674 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.040776 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.040820 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.068041 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"a74dd55763b606bc9278adcb53618735d3d9f706a1b407bef735ee336f188514"}
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.093584 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.096997 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142790 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142883 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142904 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142937 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142991 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143140 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143198 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143253 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143277 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.144286 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.147955 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.150099 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.151505 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.152451 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.161826 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.168373 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.189442 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.189969 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.190013 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.208612 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.231170 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" podStartSLOduration=10.231149427 podStartE2EDuration="10.231149427s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:11.122430391 +0000 UTC m=+148.601942908" watchObservedRunningTime="2026-02-17 00:08:11.231149427 +0000 UTC m=+148.710661954"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.236494 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.237020 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.237955 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.244007 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.244105 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.244214 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.245211 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.245476 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.245611 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.269014 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.345232 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.345477 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.345511 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.346435 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.439434 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9dcp"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.440434 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.446653 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.446698 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.446726 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.447413 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.453019 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.458665 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.491000 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.547890 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.547986 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.548042 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.553257 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.652549 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.652614 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.652678 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.653262 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.653504 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.673743 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.676772 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 00:08:11 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld
Feb 17 00:08:11 crc kubenswrapper[4791]: [+]process-running ok
Feb 17 00:08:11 crc kubenswrapper[4791]: healthz check failed
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.676848 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.777952 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"]
Feb 17 00:08:11 crc kubenswrapper[4791]: W0217 00:08:11.784763 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf04d6e19_5c11_4527_8a49_3208098d2575.slice/crio-8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f WatchSource:0}: Error finding container 8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f: Status 404 returned error can't find the container with id 8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.811327 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp"
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.819910 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"]
Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.825489 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"]
Feb 17 00:08:11 crc kubenswrapper[4791]: W0217 00:08:11.834017 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc33165ce_519a_4b0e_b62a_f153d38fc14c.slice/crio-a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55 WatchSource:0}: Error finding container a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55: Status 404 returned error can't find the container with id a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55
Feb 17 00:08:11 crc kubenswrapper[4791]: W0217 00:08:11.835365 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9895217e_9934_4f80_a583_98842d597690.slice/crio-977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953 WatchSource:0}: Error finding container 977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953: Status 404 returned error can't find the container with id 977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.000698 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"]
Feb 17 00:08:12 crc kubenswrapper[4791]: W0217 00:08:12.002039 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335 WatchSource:0}: Error finding container 9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335: Status 404 returned error can't find the container with id 9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335
Feb 17 00:08:12 crc kubenswrapper[4791]: W0217 00:08:12.005639 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613 WatchSource:0}: Error finding container 1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613: Status 404 returned error can't find the container with id 1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.081465 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fe8fc49335ce9c112e635fb18ef33bc5b86c1cb1a438af4f84b3fc61e068b2b3"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.081538 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"153e79a3bc4b82199aa5f26d5fae710b75335a093fc90ef0af29f47e3107c3f0"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.081970 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.109400 4791 generic.go:334] "Generic (PLEG): container finished" podID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerID="891df08438b436b0b07d8427461e11fc0c9eb7e638150952cf235c2137b716b6" exitCode=0
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.109498 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerDied","Data":"891df08438b436b0b07d8427461e11fc0c9eb7e638150952cf235c2137b716b6"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.112122 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerStarted","Data":"f3a8bf4a4e4255984cba5a86035a408c84d7e84e14a3acd43f2d8aaf7ecd5cee"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.115258 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerStarted","Data":"977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.117172 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerStarted","Data":"a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.118534 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.121141 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerStarted","Data":"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.121161 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerStarted","Data":"8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.121865 4791 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.123699 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613"}
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.302697 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"]
Feb 17 00:08:12 crc kubenswrapper[4791]: W0217 00:08:12.354623 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e98ac61_7140_4c15_8c29_47676734a52d.slice/crio-b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16 WatchSource:0}: Error finding container b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16: Status 404 returned error can't find the container with id b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.432520 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.433439 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.436065 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.436611 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.444265 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.566183 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.566289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.667891 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.667982 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.668100 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.676633 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 00:08:12 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld
Feb 17 00:08:12 crc kubenswrapper[4791]: [+]process-running ok
Feb 17 00:08:12 crc kubenswrapper[4791]: healthz check failed
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.676972 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.691709 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.745926 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.031180 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"]
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.032424 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.041399 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.056802 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"]
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.139402 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.149782 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"342c0d8e9fe915056a180d479cef043362d410ea2642f6da8cd8cee34bd4460c"}
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.171749 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerStarted","Data":"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3"}
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.172338 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.176080 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.176152 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.176217 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr"
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.184380 4791 generic.go:334] "Generic (PLEG): container finished" podID="f04d6e19-5c11-4527-8a49-3208098d2575" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c" exitCode=0
Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.184505 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.188220 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"62f4c03ce4a00b767928e580f0d8ab3b5b984df57bef049c06764177ce55597c"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.190030 4791 generic.go:334] "Generic (PLEG): container finished" podID="9e98ac61-7140-4c15-8c29-47676734a52d" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.190131 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.190168 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerStarted","Data":"b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.194786 4791 generic.go:334] "Generic (PLEG): container finished" podID="48855520-658c-4579-a867-7e984bce56c7" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.194845 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" 
event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.211053 4791 generic.go:334] "Generic (PLEG): container finished" podID="9895217e-9934-4f80-a583-98842d597690" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.212424 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.217231 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" podStartSLOduration=131.217210873 podStartE2EDuration="2m11.217210873s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:13.210226389 +0000 UTC m=+150.689738916" watchObservedRunningTime="2026-02-17 00:08:13.217210873 +0000 UTC m=+150.696723400" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.279436 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.279467 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod 
\"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.279562 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.281062 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.281094 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.305033 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.355664 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.458474 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.461498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.475042 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.607970 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.608041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.608079 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.611541 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.660981 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.676704 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:13 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:13 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:13 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.676989 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709478 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709586 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709634 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvk22\" (UniqueName: 
\"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709767 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709833 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709876 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.710290 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.710778 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod 
\"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.710802 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" (UID: "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.716213 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22" (OuterVolumeSpecName: "kube-api-access-tvk22") pod "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" (UID: "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c"). InnerVolumeSpecName "kube-api-access-tvk22". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.720221 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" (UID: "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.729429 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.811448 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.811490 4791 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.811508 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.847369 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.024288 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:08:14 crc kubenswrapper[4791]: E0217 00:08:14.024822 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerName="collect-profiles" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.024839 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerName="collect-profiles" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.024953 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerName="collect-profiles" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.025825 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.031020 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.032926 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.066523 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.075076 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.160717 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:14 crc 
kubenswrapper[4791]: I0217 00:08:14.160757 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.162655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.162731 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.162781 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.186359 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.190663 4791 patch_prober.go:28] interesting pod/console-f9d7485db-frmbv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.190705 4791 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-frmbv" podUID="155619c1-12ba-4149-9dce-474e3735168c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.245000 4791 generic.go:334] "Generic (PLEG): container finished" podID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd" exitCode=0 Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.245062 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.245088 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerStarted","Data":"0a2777b10322faf11f31316ab253e0d88d658e157f8af01279ebdea911a277fa"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.255510 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerStarted","Data":"c3c101c94bb03448c1b4eb251a0fab2366de8d1d9d41e50fd26d9559d10d5bd0"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.255549 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerStarted","Data":"6dac1b8347eda9b35ac38349a12edc6d86d8bfa6b545b17f2fe10256ba56df85"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.265721 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.265867 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.265944 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.266511 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.267303 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.276382 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" 
event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerDied","Data":"a39ba9373a1c8330d92d717ecb292520ee487332b22cd14b0e5fe57ebfb54ebe"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.276422 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a39ba9373a1c8330d92d717ecb292520ee487332b22cd14b0e5fe57ebfb54ebe" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.277071 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.292315 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.292455 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.293089 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.298843 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.299134 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.302201 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.309236 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.312963 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.313011 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rt865" podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.313417 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.313588 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt865" 
podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.350801 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.427858 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.435153 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.435339 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479083 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479233 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479413 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479544 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479713 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580672 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580741 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580789 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580816 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580853 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.581061 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.581527 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.581518 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod 
\"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.605005 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.632772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: E0217 00:08:14.635723 4791 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509fdefa_10b0_4752_b844_843ed9e7106d.slice/crio-conmon-df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509fdefa_10b0_4752_b844_843ed9e7106d.slice/crio-df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98.scope\": RecentStats: unable to find data in memory cache]" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.666534 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.673764 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.679325 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:14 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:14 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:14 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.679384 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.686362 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:08:14 crc kubenswrapper[4791]: W0217 00:08:14.721242 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6c03f6_847b_402c_bfde_6dd30870b907.slice/crio-d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22 WatchSource:0}: Error finding container d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22: Status 404 returned error can't find the container with id d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22 Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.760506 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.940495 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.019961 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.027407 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:15 crc kubenswrapper[4791]: W0217 00:08:15.065280 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc437e64_8eee_418b_83d2_f79578cec0fe.slice/crio-66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b WatchSource:0}: Error finding container 66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b: Status 404 returned error can't find the container with id 66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.285806 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.285860 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.290017 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerStarted","Data":"dd5583159b97d008658f84cf600c20f343a255c6d03a482c074c8e01afed6de8"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.294172 4791 generic.go:334] "Generic (PLEG): container finished" podID="8b169853-1972-4cb9-9a80-159b0b3456fa" containerID="c3c101c94bb03448c1b4eb251a0fab2366de8d1d9d41e50fd26d9559d10d5bd0" exitCode=0 Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.294431 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerDied","Data":"c3c101c94bb03448c1b4eb251a0fab2366de8d1d9d41e50fd26d9559d10d5bd0"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.303157 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8" exitCode=0 Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.303236 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.303284 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerStarted","Data":"d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.311647 4791 generic.go:334] "Generic (PLEG): container finished" podID="509fdefa-10b0-4752-b844-843ed9e7106d" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" exitCode=0 Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.311687 4791 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.311714 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerStarted","Data":"10bc712141e232e5d7064d897763d2f2a15a17f24973eb264bc82a8e5f9eb39a"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.673947 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.684601 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:15 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:15 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:15 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.684676 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.710879 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"8b169853-1972-4cb9-9a80-159b0b3456fa\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.711035 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"8b169853-1972-4cb9-9a80-159b0b3456fa\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.711352 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b169853-1972-4cb9-9a80-159b0b3456fa" (UID: "8b169853-1972-4cb9-9a80-159b0b3456fa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.716046 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b169853-1972-4cb9-9a80-159b0b3456fa" (UID: "8b169853-1972-4cb9-9a80-159b0b3456fa"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.811741 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.811782 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.330103 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerDied","Data":"6dac1b8347eda9b35ac38349a12edc6d86d8bfa6b545b17f2fe10256ba56df85"} Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.330150 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dac1b8347eda9b35ac38349a12edc6d86d8bfa6b545b17f2fe10256ba56df85" Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.330257 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.361007 4791 generic.go:334] "Generic (PLEG): container finished" podID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerID="4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb" exitCode=0 Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.361072 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb"} Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.364571 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerStarted","Data":"47bb1560d306fcdd076335cbd39df3e7b038c57fdd7600efd790c8af4a167710"} Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.676080 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:16 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:16 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:16 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.676201 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.059237 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4chtt" Feb 17 
00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.395746 4791 generic.go:334] "Generic (PLEG): container finished" podID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerID="47bb1560d306fcdd076335cbd39df3e7b038c57fdd7600efd790c8af4a167710" exitCode=0 Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.395791 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerDied","Data":"47bb1560d306fcdd076335cbd39df3e7b038c57fdd7600efd790c8af4a167710"} Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.678364 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:17 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:17 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:17 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.678422 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.714492 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.754273 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"28abe80d-37ae-45f7-abab-5bc2150e038e\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.754372 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "28abe80d-37ae-45f7-abab-5bc2150e038e" (UID: "28abe80d-37ae-45f7-abab-5bc2150e038e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.754559 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.855031 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"28abe80d-37ae-45f7-abab-5bc2150e038e\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.878585 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "28abe80d-37ae-45f7-abab-5bc2150e038e" (UID: "28abe80d-37ae-45f7-abab-5bc2150e038e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.957093 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.427452 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerDied","Data":"dd5583159b97d008658f84cf600c20f343a255c6d03a482c074c8e01afed6de8"} Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.427495 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5583159b97d008658f84cf600c20f343a255c6d03a482c074c8e01afed6de8" Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.427511 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.689583 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:18 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:18 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:18 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.689652 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:19 crc kubenswrapper[4791]: I0217 00:08:19.675550 4791 patch_prober.go:28] interesting 
pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:19 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:19 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:19 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:19 crc kubenswrapper[4791]: I0217 00:08:19.675614 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:20 crc kubenswrapper[4791]: I0217 00:08:20.677212 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:20 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:20 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:20 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:20 crc kubenswrapper[4791]: I0217 00:08:20.677606 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:21 crc kubenswrapper[4791]: I0217 00:08:21.676730 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:21 crc kubenswrapper[4791]: [+]has-synced ok Feb 17 00:08:21 crc kubenswrapper[4791]: [+]process-running ok Feb 
17 00:08:21 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:21 crc kubenswrapper[4791]: I0217 00:08:21.676795 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:22 crc kubenswrapper[4791]: I0217 00:08:22.676782 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:22 crc kubenswrapper[4791]: I0217 00:08:22.679343 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.135992 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.140140 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.315697 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.972938 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.972999 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:08:25 crc kubenswrapper[4791]: I0217 00:08:25.692063 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:08:25 crc kubenswrapper[4791]: I0217 00:08:25.699862 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:08:25 crc kubenswrapper[4791]: I0217 00:08:25.848825 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:08:27 crc kubenswrapper[4791]: I0217 00:08:27.491873 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6x28n"] Feb 17 00:08:31 crc kubenswrapper[4791]: I0217 00:08:31.104133 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:32 crc kubenswrapper[4791]: W0217 00:08:32.104179 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d97cf45_2324_494c_839f_6f264eba3828.slice/crio-2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e WatchSource:0}: Error finding container 2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e: Status 404 returned error can't find the container with id 2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e Feb 17 00:08:32 crc 
kubenswrapper[4791]: I0217 00:08:32.523563 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6x28n" event={"ID":"1d97cf45-2324-494c-839f-6f264eba3828","Type":"ContainerStarted","Data":"2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e"} Feb 17 00:08:38 crc kubenswrapper[4791]: I0217 00:08:38.553518 4791 generic.go:334] "Generic (PLEG): container finished" podID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerID="1bf210069f01dcf3433075dd8a895405951d971de359016b4eb9aa868416c26a" exitCode=0 Feb 17 00:08:38 crc kubenswrapper[4791]: I0217 00:08:38.553618 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerDied","Data":"1bf210069f01dcf3433075dd8a895405951d971de359016b4eb9aa868416c26a"} Feb 17 00:08:41 crc kubenswrapper[4791]: I0217 00:08:41.247284 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:08:42 crc kubenswrapper[4791]: E0217 00:08:42.747602 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 00:08:42 crc kubenswrapper[4791]: E0217 00:08:42.749691 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tt7zf_openshift-marketplace(9895217e-9934-4f80-a583-98842d597690): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:42 crc kubenswrapper[4791]: E0217 00:08:42.751063 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tt7zf" podUID="9895217e-9934-4f80-a583-98842d597690" Feb 17 00:08:43 crc 
kubenswrapper[4791]: E0217 00:08:43.770233 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tt7zf" podUID="9895217e-9934-4f80-a583-98842d597690" Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.812484 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.876206 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.876332 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2txwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-h66xr_openshift-marketplace(db1caaaf-7e8b-405c-97ff-7c507f068688): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.877435 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" Feb 17 00:08:43 crc 
kubenswrapper[4791]: E0217 00:08:43.919036 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.919200 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rq9sd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-cgmd4_openshift-marketplace(48855520-658c-4579-a867-7e984bce56c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.921001 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.937567 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.937692 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hwtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-twq6q_openshift-marketplace(509fdefa-10b0-4752-b844-843ed9e7106d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.939007 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-twq6q" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" Feb 17 00:08:43 crc 
kubenswrapper[4791]: I0217 00:08:43.948034 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"94401a93-55c7-4e8b-83f7-dc27a876f335\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.948173 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"94401a93-55c7-4e8b-83f7-dc27a876f335\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.949426 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca" (OuterVolumeSpecName: "serviceca") pod "94401a93-55c7-4e8b-83f7-dc27a876f335" (UID: "94401a93-55c7-4e8b-83f7-dc27a876f335"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.954574 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z" (OuterVolumeSpecName: "kube-api-access-cgt9z") pod "94401a93-55c7-4e8b-83f7-dc27a876f335" (UID: "94401a93-55c7-4e8b-83f7-dc27a876f335"). InnerVolumeSpecName "kube-api-access-cgt9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.049447 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.049479 4791 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.587086 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerStarted","Data":"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.589809 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.589809 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerDied","Data":"142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.590136 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.592000 4791 generic.go:334] "Generic (PLEG): container finished" podID="9e98ac61-7140-4c15-8c29-47676734a52d" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" exitCode=0 Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.592093 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.594635 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.597044 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6x28n" event={"ID":"1d97cf45-2324-494c-839f-6f264eba3828","Type":"ContainerStarted","Data":"e4a107bdf67b5b0b3b7b726d00831fb2f524244233c4f8ac2f8ddc8c2767337e"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.597088 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-6x28n" event={"ID":"1d97cf45-2324-494c-839f-6f264eba3828","Type":"ContainerStarted","Data":"0607d177a5164d672755abaafdf40881feea71c0f29f6a2d4ff504d3445deaf4"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.611741 4791 generic.go:334] "Generic (PLEG): container finished" podID="f04d6e19-5c11-4527-8a49-3208098d2575" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2" exitCode=0 Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.611959 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"} Feb 17 00:08:44 crc kubenswrapper[4791]: E0217 00:08:44.613719 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" Feb 17 00:08:44 crc kubenswrapper[4791]: E0217 00:08:44.614886 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-twq6q" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" Feb 17 00:08:44 crc kubenswrapper[4791]: E0217 00:08:44.614994 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" Feb 17 00:08:44 crc kubenswrapper[4791]: 
I0217 00:08:44.668223 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.704447 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6x28n" podStartSLOduration=162.704406689 podStartE2EDuration="2m42.704406689s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:44.697339602 +0000 UTC m=+182.176852139" watchObservedRunningTime="2026-02-17 00:08:44.704406689 +0000 UTC m=+182.183919226" Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.617692 4791 generic.go:334] "Generic (PLEG): container finished" podID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerID="08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d" exitCode=0 Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.617747 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.623820 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerStarted","Data":"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.628925 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5" exitCode=0 Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.628973 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.634799 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerStarted","Data":"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.714081 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9dcp" podStartSLOduration=2.894046635 podStartE2EDuration="34.714059251s" podCreationTimestamp="2026-02-17 00:08:11 +0000 UTC" firstStartedPulling="2026-02-17 00:08:13.191030832 +0000 UTC m=+150.670543359" lastFinishedPulling="2026-02-17 00:08:45.011043448 +0000 UTC m=+182.490555975" observedRunningTime="2026-02-17 00:08:45.691726923 +0000 UTC m=+183.171239450" watchObservedRunningTime="2026-02-17 00:08:45.714059251 +0000 UTC m=+183.193571798" Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.714935 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8xbcp" podStartSLOduration=1.6209202120000001 podStartE2EDuration="34.714926579s" podCreationTimestamp="2026-02-17 00:08:11 +0000 UTC" firstStartedPulling="2026-02-17 00:08:12.121635119 +0000 UTC m=+149.601147646" lastFinishedPulling="2026-02-17 00:08:45.215641476 +0000 UTC m=+182.695154013" observedRunningTime="2026-02-17 00:08:45.713229455 +0000 UTC m=+183.192742002" watchObservedRunningTime="2026-02-17 00:08:45.714926579 +0000 UTC m=+183.194439126" Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.648236 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" 
event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820"} Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.652261 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerStarted","Data":"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"} Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.668322 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bj8pc" podStartSLOduration=2.974785819 podStartE2EDuration="32.668301103s" podCreationTimestamp="2026-02-17 00:08:14 +0000 UTC" firstStartedPulling="2026-02-17 00:08:16.362303486 +0000 UTC m=+153.841816013" lastFinishedPulling="2026-02-17 00:08:46.05581877 +0000 UTC m=+183.535331297" observedRunningTime="2026-02-17 00:08:46.664347216 +0000 UTC m=+184.143859743" watchObservedRunningTime="2026-02-17 00:08:46.668301103 +0000 UTC m=+184.147813630" Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.690904 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s76xp" podStartSLOduration=1.965649642 podStartE2EDuration="32.690888149s" podCreationTimestamp="2026-02-17 00:08:14 +0000 UTC" firstStartedPulling="2026-02-17 00:08:15.30715205 +0000 UTC m=+152.786664577" lastFinishedPulling="2026-02-17 00:08:46.032390567 +0000 UTC m=+183.511903084" observedRunningTime="2026-02-17 00:08:46.688620626 +0000 UTC m=+184.168133153" watchObservedRunningTime="2026-02-17 00:08:46.690888149 +0000 UTC m=+184.170400676" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.348323 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.348668 4791 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482170 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 00:08:51 crc kubenswrapper[4791]: E0217 00:08:51.482436 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482451 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: E0217 00:08:51.482468 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerName="image-pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482477 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerName="image-pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: E0217 00:08:51.482498 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b169853-1972-4cb9-9a80-159b0b3456fa" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482508 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b169853-1972-4cb9-9a80-159b0b3456fa" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482637 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerName="image-pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482652 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482664 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b169853-1972-4cb9-9a80-159b0b3456fa" 
containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.483085 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.485057 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.485877 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.494554 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.558693 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.653193 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.653773 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.722998 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 
00:08:51.754755 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.754808 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.754874 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.780671 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.797265 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.812329 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.812363 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.875361 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:52 crc kubenswrapper[4791]: I0217 00:08:52.062451 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 00:08:52 crc kubenswrapper[4791]: W0217 00:08:52.076902 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode0995bd5_bc3f_4011_b6ca_ee70c4d84798.slice/crio-97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d WatchSource:0}: Error finding container 97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d: Status 404 returned error can't find the container with id 97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d Feb 17 00:08:52 crc kubenswrapper[4791]: I0217 00:08:52.691356 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e0995bd5-bc3f-4011-b6ca-ee70c4d84798","Type":"ContainerStarted","Data":"97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d"} Feb 17 00:08:52 crc kubenswrapper[4791]: I0217 00:08:52.728328 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.157232 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.629402 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.697819 4791 generic.go:334] "Generic (PLEG): container finished" podID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerID="3fa5b59208b65017a2be0da1d6bb8489fd1d5ca69ba90c98d7179b98a37399b3" exitCode=0 Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.697927 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e0995bd5-bc3f-4011-b6ca-ee70c4d84798","Type":"ContainerDied","Data":"3fa5b59208b65017a2be0da1d6bb8489fd1d5ca69ba90c98d7179b98a37399b3"} Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.352043 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.352124 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.406094 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.703920 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9dcp" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" containerID="cri-o://5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" gracePeriod=2 Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.748251 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.761258 
4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.761478 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.807513 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.956709 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.972780 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.972856 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.060694 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.133787 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.133884 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.134006 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e0995bd5-bc3f-4011-b6ca-ee70c4d84798" (UID: "e0995bd5-bc3f-4011-b6ca-ee70c4d84798"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.134171 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.138396 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e0995bd5-bc3f-4011-b6ca-ee70c4d84798" (UID: "e0995bd5-bc3f-4011-b6ca-ee70c4d84798"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237046 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"9e98ac61-7140-4c15-8c29-47676734a52d\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237214 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"9e98ac61-7140-4c15-8c29-47676734a52d\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237665 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"9e98ac61-7140-4c15-8c29-47676734a52d\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237847 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities" (OuterVolumeSpecName: "utilities") pod "9e98ac61-7140-4c15-8c29-47676734a52d" (UID: "9e98ac61-7140-4c15-8c29-47676734a52d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.240770 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.240810 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.256480 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4" (OuterVolumeSpecName: "kube-api-access-f6fd4") pod "9e98ac61-7140-4c15-8c29-47676734a52d" (UID: "9e98ac61-7140-4c15-8c29-47676734a52d"). InnerVolumeSpecName "kube-api-access-f6fd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.322968 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e98ac61-7140-4c15-8c29-47676734a52d" (UID: "9e98ac61-7140-4c15-8c29-47676734a52d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.342188 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.342216 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.713591 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e0995bd5-bc3f-4011-b6ca-ee70c4d84798","Type":"ContainerDied","Data":"97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d"} Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.713650 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.713850 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.722337 4791 generic.go:334] "Generic (PLEG): container finished" podID="9e98ac61-7140-4c15-8c29-47676734a52d" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" exitCode=0 Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723225 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723253 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e"} Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723310 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16"} Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723334 4791 scope.go:117] "RemoveContainer" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.743024 4791 scope.go:117] "RemoveContainer" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.765815 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.768462 4791 scope.go:117] "RemoveContainer" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.768131 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.770865 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.786126 4791 scope.go:117] "RemoveContainer" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" Feb 17 00:08:55 crc 
kubenswrapper[4791]: E0217 00:08:55.786436 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e\": container with ID starting with 5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e not found: ID does not exist" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.786467 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e"} err="failed to get container status \"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e\": rpc error: code = NotFound desc = could not find container \"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e\": container with ID starting with 5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e not found: ID does not exist" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.786502 4791 scope.go:117] "RemoveContainer" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" Feb 17 00:08:55 crc kubenswrapper[4791]: E0217 00:08:55.787646 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533\": container with ID starting with b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533 not found: ID does not exist" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.787692 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533"} err="failed to get container status 
\"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533\": rpc error: code = NotFound desc = could not find container \"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533\": container with ID starting with b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533 not found: ID does not exist" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.787719 4791 scope.go:117] "RemoveContainer" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" Feb 17 00:08:55 crc kubenswrapper[4791]: E0217 00:08:55.788161 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7\": container with ID starting with d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7 not found: ID does not exist" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.788190 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7"} err="failed to get container status \"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7\": rpc error: code = NotFound desc = could not find container \"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7\": container with ID starting with d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7 not found: ID does not exist" Feb 17 00:08:56 crc kubenswrapper[4791]: I0217 00:08:56.632680 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.228184 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" path="/var/lib/kubelet/pods/9e98ac61-7140-4c15-8c29-47676734a52d/volumes" Feb 17 00:08:57 
crc kubenswrapper[4791]: I0217 00:08:57.479471 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479679 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-content" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479689 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-content" Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479699 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479704 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479719 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-utilities" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479726 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-utilities" Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479736 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerName="pruner" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479742 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerName="pruner" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479887 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479902 4791 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerName="pruner" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.480263 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.482696 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.482967 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.488668 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.675448 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.675489 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.675513 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.733439 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bj8pc" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" containerID="cri-o://d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820" gracePeriod=2 Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776336 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776392 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776411 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776572 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776573 4791 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.806048 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.809248 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:58 crc kubenswrapper[4791]: I0217 00:08:58.740459 4791 generic.go:334] "Generic (PLEG): container finished" podID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerID="d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820" exitCode=0 Feb 17 00:08:58 crc kubenswrapper[4791]: I0217 00:08:58.740526 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.281948 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.299515 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 00:08:59 crc kubenswrapper[4791]: W0217 00:08:59.310538 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod45d65fd1_6366_4ed0_bc40_10d5418435ea.slice/crio-e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd WatchSource:0}: Error finding container e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd: Status 404 returned error can't find the container with id e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.411406 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod \"fc437e64-8eee-418b-83d2-f79578cec0fe\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.411759 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"fc437e64-8eee-418b-83d2-f79578cec0fe\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.411811 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"fc437e64-8eee-418b-83d2-f79578cec0fe\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.412636 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities" (OuterVolumeSpecName: "utilities") pod "fc437e64-8eee-418b-83d2-f79578cec0fe" (UID: "fc437e64-8eee-418b-83d2-f79578cec0fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.416900 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h" (OuterVolumeSpecName: "kube-api-access-ngk4h") pod "fc437e64-8eee-418b-83d2-f79578cec0fe" (UID: "fc437e64-8eee-418b-83d2-f79578cec0fe"). InnerVolumeSpecName "kube-api-access-ngk4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.513948 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.513978 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.533647 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc437e64-8eee-418b-83d2-f79578cec0fe" (UID: "fc437e64-8eee-418b-83d2-f79578cec0fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.615003 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.749182 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.749204 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.749246 4791 scope.go:117] "RemoveContainer" containerID="d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.754649 4791 generic.go:334] "Generic (PLEG): container finished" podID="48855520-658c-4579-a867-7e984bce56c7" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61" exitCode=0 Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.754675 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.757299 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerStarted","Data":"625e883c4175a7bb832ef122a4de927b44b76c4d7352658a36f562332f5bdbaa"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.757354 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerStarted","Data":"e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.768276 4791 scope.go:117] "RemoveContainer" containerID="08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.839778 4791 scope.go:117] "RemoveContainer" containerID="4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.849748 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.8495815220000003 podStartE2EDuration="2.849581522s" podCreationTimestamp="2026-02-17 00:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:59.846327387 +0000 UTC m=+197.325839924" watchObservedRunningTime="2026-02-17 00:08:59.849581522 +0000 UTC m=+197.329094049" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.866009 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.870276 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:09:00 crc kubenswrapper[4791]: I0217 00:09:00.764640 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerStarted","Data":"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"} Feb 17 00:09:00 crc kubenswrapper[4791]: I0217 00:09:00.767305 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerStarted","Data":"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a"} Feb 17 00:09:00 crc kubenswrapper[4791]: I0217 00:09:00.781719 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cgmd4" podStartSLOduration=3.73993951 podStartE2EDuration="50.781701661s" podCreationTimestamp="2026-02-17 00:08:10 +0000 UTC" firstStartedPulling="2026-02-17 00:08:13.212922256 +0000 UTC m=+150.692434793" lastFinishedPulling="2026-02-17 00:09:00.254684417 +0000 UTC m=+197.734196944" observedRunningTime="2026-02-17 00:09:00.780660499 +0000 UTC m=+198.260173026" watchObservedRunningTime="2026-02-17 00:09:00.781701661 +0000 UTC m=+198.261214188" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.209495 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.209551 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.226209 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" path="/var/lib/kubelet/pods/fc437e64-8eee-418b-83d2-f79578cec0fe/volumes" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.773460 4791 generic.go:334] "Generic (PLEG): container finished" podID="9895217e-9934-4f80-a583-98842d597690" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" exitCode=0 Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.773527 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" 
event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a"} Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.775694 4791 generic.go:334] "Generic (PLEG): container finished" podID="509fdefa-10b0-4752-b844-843ed9e7106d" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" exitCode=0 Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.775743 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7"} Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.780280 4791 generic.go:334] "Generic (PLEG): container finished" podID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943" exitCode=0 Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.780353 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.256232 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" probeResult="failure" output=< Feb 17 00:09:02 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:09:02 crc kubenswrapper[4791]: > Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.787973 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" 
event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerStarted","Data":"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.791028 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerStarted","Data":"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.793529 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerStarted","Data":"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.810453 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tt7zf" podStartSLOduration=2.77939956 podStartE2EDuration="51.810442471s" podCreationTimestamp="2026-02-17 00:08:11 +0000 UTC" firstStartedPulling="2026-02-17 00:08:13.214274039 +0000 UTC m=+150.693786566" lastFinishedPulling="2026-02-17 00:09:02.24531695 +0000 UTC m=+199.724829477" observedRunningTime="2026-02-17 00:09:02.808714935 +0000 UTC m=+200.288227462" watchObservedRunningTime="2026-02-17 00:09:02.810442471 +0000 UTC m=+200.289954998" Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.822548 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-twq6q" podStartSLOduration=2.631955784 podStartE2EDuration="49.822529159s" podCreationTimestamp="2026-02-17 00:08:13 +0000 UTC" firstStartedPulling="2026-02-17 00:08:15.313141033 +0000 UTC m=+152.792653560" lastFinishedPulling="2026-02-17 00:09:02.503714408 +0000 UTC m=+199.983226935" observedRunningTime="2026-02-17 00:09:02.821226467 +0000 UTC m=+200.300739004" 
watchObservedRunningTime="2026-02-17 00:09:02.822529159 +0000 UTC m=+200.302041686" Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.844496 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h66xr" podStartSLOduration=1.6676257159999999 podStartE2EDuration="49.844478044s" podCreationTimestamp="2026-02-17 00:08:13 +0000 UTC" firstStartedPulling="2026-02-17 00:08:14.247102876 +0000 UTC m=+151.726615393" lastFinishedPulling="2026-02-17 00:09:02.423955194 +0000 UTC m=+199.903467721" observedRunningTime="2026-02-17 00:09:02.841302353 +0000 UTC m=+200.320814880" watchObservedRunningTime="2026-02-17 00:09:02.844478044 +0000 UTC m=+200.323990581" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.355858 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.355909 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.847956 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.848034 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:04 crc kubenswrapper[4791]: I0217 00:09:04.399785 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" probeResult="failure" output=< Feb 17 00:09:04 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:09:04 crc kubenswrapper[4791]: > Feb 17 00:09:04 crc kubenswrapper[4791]: I0217 00:09:04.893156 4791 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-marketplace-twq6q" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" probeResult="failure" output=< Feb 17 00:09:04 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:09:04 crc kubenswrapper[4791]: > Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.258329 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.307618 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.554170 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.554218 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.601495 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.916045 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.406156 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.452310 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.459383 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.862859 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tt7zf" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" containerID="cri-o://689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" gracePeriod=2 Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.889819 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.925375 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.231556 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.345049 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"9895217e-9934-4f80-a583-98842d597690\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.345093 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"9895217e-9934-4f80-a583-98842d597690\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.345179 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod 
\"9895217e-9934-4f80-a583-98842d597690\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.347480 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities" (OuterVolumeSpecName: "utilities") pod "9895217e-9934-4f80-a583-98842d597690" (UID: "9895217e-9934-4f80-a583-98842d597690"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.354634 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p" (OuterVolumeSpecName: "kube-api-access-ndp5p") pod "9895217e-9934-4f80-a583-98842d597690" (UID: "9895217e-9934-4f80-a583-98842d597690"). InnerVolumeSpecName "kube-api-access-ndp5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.413857 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9895217e-9934-4f80-a583-98842d597690" (UID: "9895217e-9934-4f80-a583-98842d597690"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.446935 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.446972 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.446983 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873233 4791 generic.go:334] "Generic (PLEG): container finished" podID="9895217e-9934-4f80-a583-98842d597690" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" exitCode=0 Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873371 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873390 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034"} Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873868 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953"} Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873899 4791 scope.go:117] "RemoveContainer" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.897390 4791 scope.go:117] "RemoveContainer" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.911523 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.918062 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.936244 4791 scope.go:117] "RemoveContainer" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.986150 4791 scope.go:117] "RemoveContainer" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" Feb 17 00:09:14 crc kubenswrapper[4791]: E0217 00:09:14.986644 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034\": container with ID starting with 689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034 not found: ID does not exist" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.986682 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034"} err="failed to get container status \"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034\": rpc error: code = NotFound desc = could not find container \"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034\": container with ID starting with 689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034 not found: ID does not exist" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.986708 4791 scope.go:117] "RemoveContainer" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" Feb 17 00:09:14 crc kubenswrapper[4791]: E0217 00:09:14.987168 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a\": container with ID starting with 50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a not found: ID does not exist" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.987221 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a"} err="failed to get container status \"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a\": rpc error: code = NotFound desc = could not find container \"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a\": container with ID 
starting with 50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a not found: ID does not exist" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.987260 4791 scope.go:117] "RemoveContainer" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" Feb 17 00:09:14 crc kubenswrapper[4791]: E0217 00:09:14.987655 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33\": container with ID starting with b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33 not found: ID does not exist" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.987721 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33"} err="failed to get container status \"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33\": rpc error: code = NotFound desc = could not find container \"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33\": container with ID starting with b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33 not found: ID does not exist" Feb 17 00:09:15 crc kubenswrapper[4791]: I0217 00:09:15.227507 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9895217e-9934-4f80-a583-98842d597690" path="/var/lib/kubelet/pods/9895217e-9934-4f80-a583-98842d597690/volumes" Feb 17 00:09:15 crc kubenswrapper[4791]: I0217 00:09:15.663350 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:09:15 crc kubenswrapper[4791]: I0217 00:09:15.882688 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-twq6q" 
podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" containerID="cri-o://ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" gracePeriod=2 Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.384550 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.472391 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"509fdefa-10b0-4752-b844-843ed9e7106d\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.472519 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod \"509fdefa-10b0-4752-b844-843ed9e7106d\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.472572 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"509fdefa-10b0-4752-b844-843ed9e7106d\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.473500 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities" (OuterVolumeSpecName: "utilities") pod "509fdefa-10b0-4752-b844-843ed9e7106d" (UID: "509fdefa-10b0-4752-b844-843ed9e7106d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.479411 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv" (OuterVolumeSpecName: "kube-api-access-8hwtv") pod "509fdefa-10b0-4752-b844-843ed9e7106d" (UID: "509fdefa-10b0-4752-b844-843ed9e7106d"). InnerVolumeSpecName "kube-api-access-8hwtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.523313 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "509fdefa-10b0-4752-b844-843ed9e7106d" (UID: "509fdefa-10b0-4752-b844-843ed9e7106d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.574315 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.574374 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.574394 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892062 4791 generic.go:334] "Generic (PLEG): container finished" podID="509fdefa-10b0-4752-b844-843ed9e7106d" 
containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" exitCode=0 Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892143 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3"} Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892185 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892207 4791 scope.go:117] "RemoveContainer" containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892191 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"10bc712141e232e5d7064d897763d2f2a15a17f24973eb264bc82a8e5f9eb39a"} Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.919695 4791 scope.go:117] "RemoveContainer" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.948210 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.953289 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.964016 4791 scope.go:117] "RemoveContainer" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.982869 4791 scope.go:117] "RemoveContainer" containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" Feb 17 
00:09:16 crc kubenswrapper[4791]: E0217 00:09:16.983577 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3\": container with ID starting with ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3 not found: ID does not exist" containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.983783 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3"} err="failed to get container status \"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3\": rpc error: code = NotFound desc = could not find container \"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3\": container with ID starting with ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3 not found: ID does not exist" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.983925 4791 scope.go:117] "RemoveContainer" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" Feb 17 00:09:16 crc kubenswrapper[4791]: E0217 00:09:16.984552 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7\": container with ID starting with 5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7 not found: ID does not exist" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.984617 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7"} err="failed to get container status 
\"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7\": rpc error: code = NotFound desc = could not find container \"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7\": container with ID starting with 5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7 not found: ID does not exist" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.984659 4791 scope.go:117] "RemoveContainer" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" Feb 17 00:09:16 crc kubenswrapper[4791]: E0217 00:09:16.985122 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98\": container with ID starting with df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98 not found: ID does not exist" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.985276 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98"} err="failed to get container status \"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98\": rpc error: code = NotFound desc = could not find container \"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98\": container with ID starting with df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98 not found: ID does not exist" Feb 17 00:09:17 crc kubenswrapper[4791]: I0217 00:09:17.231287 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" path="/var/lib/kubelet/pods/509fdefa-10b0-4752-b844-843ed9e7106d/volumes" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.185930 4791 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" containerID="cri-o://8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" gracePeriod=15 Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.523752 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705090 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705149 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705194 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705209 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705231 
4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705273 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705294 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705309 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705333 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705354 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705372 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705408 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705435 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705455 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705926 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.706295 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.709638 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.710021 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.710078 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.711806 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712079 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712271 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712439 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712629 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.717427 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n" (OuterVolumeSpecName: "kube-api-access-ws66n") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "kube-api-access-ws66n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.719861 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.720321 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.728673 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806634 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806668 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806679 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 
00:09:18.806689 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806699 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806707 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806716 4791 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806726 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806736 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806745 4791 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc 
kubenswrapper[4791]: I0217 00:09:18.806754 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806763 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806771 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806780 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.909182 4791 generic.go:334] "Generic (PLEG): container finished" podID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" exitCode=0 Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.909361 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.909396 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerDied","Data":"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7"} Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.911200 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerDied","Data":"bc92c2848641b389e30636fc84dbaa434604ffad03bf817e464c4e944f11c4ea"} Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.911251 4791 scope.go:117] "RemoveContainer" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.943247 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.946412 4791 scope.go:117] "RemoveContainer" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" Feb 17 00:09:18 crc kubenswrapper[4791]: E0217 00:09:18.947143 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7\": container with ID starting with 8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7 not found: ID does not exist" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.947227 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7"} err="failed to 
get container status \"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7\": rpc error: code = NotFound desc = could not find container \"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7\": container with ID starting with 8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7 not found: ID does not exist" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.949301 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:09:19 crc kubenswrapper[4791]: I0217 00:09:19.228918 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" path="/var/lib/kubelet/pods/c70fe9d3-348d-4bb8-89f7-21027041131a/volumes" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.607796 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-v4r9g"] Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.608948 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609052 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609159 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609247 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609341 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: 
I0217 00:09:24.609431 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609530 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609607 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609689 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609767 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609843 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609914 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609992 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610072 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.610175 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 
00:09:24.610273 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.610351 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610420 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.610497 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610577 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610778 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610861 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610923 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610984 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.611496 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.615796 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.615810 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616035 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616341 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616371 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616732 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.617514 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.617667 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.618552 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.618683 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 00:09:24 crc 
kubenswrapper[4791]: I0217 00:09:24.618787 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.619271 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-v4r9g"] Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.622701 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.627354 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.628757 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.633193 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798809 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798870 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: 
\"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798907 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798932 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-dir\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798951 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798976 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799102 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799193 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58nk\" (UniqueName: \"kubernetes.io/projected/1917d075-c5be-48b4-baa3-a25bc8a6655b-kube-api-access-k58nk\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799372 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799491 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799523 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799562 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799594 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-policies\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900004 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: 
\"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900051 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900075 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900095 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-policies\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900132 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900147 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900171 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900195 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-dir\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900221 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900249 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: 
\"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900271 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900291 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58nk\" (UniqueName: \"kubernetes.io/projected/1917d075-c5be-48b4-baa3-a25bc8a6655b-kube-api-access-k58nk\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900315 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901022 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-dir\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901294 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901562 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901717 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-policies\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901817 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc 
kubenswrapper[4791]: I0217 00:09:24.905883 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.906010 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.907310 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.907509 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.908451 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.908975 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.912851 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.918514 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58nk\" (UniqueName: \"kubernetes.io/projected/1917d075-c5be-48b4-baa3-a25bc8a6655b-kube-api-access-k58nk\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.919594 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc 
kubenswrapper[4791]: I0217 00:09:24.973381 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.973453 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.973506 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.974051 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.974186 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28" gracePeriod=600 Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.979731 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.166690 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-v4r9g"] Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.954784 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" event={"ID":"1917d075-c5be-48b4-baa3-a25bc8a6655b","Type":"ContainerStarted","Data":"ee5a27a28e2f585a97fd3e167f824e5269f315e80111e803154c9780bee80889"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.955448 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.955469 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" event={"ID":"1917d075-c5be-48b4-baa3-a25bc8a6655b","Type":"ContainerStarted","Data":"60315c44462d35ce29a4625b1eb5cd20cf0f03f317abd9826169edbb52be1877"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.957323 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28" exitCode=0 Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.957423 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.957613 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" 
event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.963132 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.982123 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" podStartSLOduration=32.982085798 podStartE2EDuration="32.982085798s" podCreationTimestamp="2026-02-17 00:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:09:25.978611136 +0000 UTC m=+223.458123673" watchObservedRunningTime="2026-02-17 00:09:25.982085798 +0000 UTC m=+223.461598325" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.423229 4791 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424484 4791 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424722 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424774 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424834 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424863 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424882 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424798 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.426529 4791 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427011 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427035 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427049 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427151 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427165 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427172 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427180 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427187 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427195 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427222 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427232 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427238 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427245 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427253 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427389 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427400 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427408 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427419 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427428 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427437 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.585967 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586324 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586400 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586502 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586553 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586594 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586616 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586708 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688308 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688371 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688398 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688426 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688426 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688481 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688492 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688451 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688549 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688572 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688505 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688584 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688637 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688612 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688634 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.030776 4791 generic.go:334] "Generic (PLEG): container finished" podID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerID="625e883c4175a7bb832ef122a4de927b44b76c4d7352658a36f562332f5bdbaa" exitCode=0
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.030891 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerDied","Data":"625e883c4175a7bb832ef122a4de927b44b76c4d7352658a36f562332f5bdbaa"}
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.032461 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused"
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.032924 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused"
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.033749 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.034851 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.035740 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" exitCode=0
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.035869 4791 scope.go:117] "RemoveContainer" containerID="f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c"
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.035927 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" exitCode=0
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.036065 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" exitCode=0
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.036090 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" exitCode=2
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.122971 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body=
Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.123059 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused"
Feb 17 00:09:39 crc kubenswrapper[4791]: I0217 00:09:39.842629 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.030372 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.031243 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.032790 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.033602 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.133834 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.134637 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.135128 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221404 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221508 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221512 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221551 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221608 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221679 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221873 4791 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221893 4791 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221902 4791 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322689 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"45d65fd1-6366-4ed0-bc40-10d5418435ea\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") "
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322750 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"45d65fd1-6366-4ed0-bc40-10d5418435ea\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") "
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322793 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"45d65fd1-6366-4ed0-bc40-10d5418435ea\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") "
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322807 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock" (OuterVolumeSpecName: "var-lock") pod "45d65fd1-6366-4ed0-bc40-10d5418435ea" (UID: "45d65fd1-6366-4ed0-bc40-10d5418435ea"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322872 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "45d65fd1-6366-4ed0-bc40-10d5418435ea" (UID: "45d65fd1-6366-4ed0-bc40-10d5418435ea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.323163 4791 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.323184 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.327667 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "45d65fd1-6366-4ed0-bc40-10d5418435ea" (UID: "45d65fd1-6366-4ed0-bc40-10d5418435ea"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.424081 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.853436 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.854443 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" exitCode=0
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.854544 4791 scope.go:117] "RemoveContainer" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.854555 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.857170 4791 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.857317 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerDied","Data":"e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd"} Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.857466 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.872981 4791 scope.go:117] "RemoveContainer" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.876285 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.876814 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.884535 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.885942 
4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.892873 4791 scope.go:117] "RemoveContainer" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.907502 4791 scope.go:117] "RemoveContainer" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.919216 4791 scope.go:117] "RemoveContainer" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.936410 4791 scope.go:117] "RemoveContainer" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.976064 4791 scope.go:117] "RemoveContainer" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.976512 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\": container with ID starting with 4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef not found: ID does not exist" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.976858 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef"} err="failed to get container status 
\"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\": rpc error: code = NotFound desc = could not find container \"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\": container with ID starting with 4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.976949 4791 scope.go:117] "RemoveContainer" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.977202 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\": container with ID starting with 2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373 not found: ID does not exist" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977275 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373"} err="failed to get container status \"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\": rpc error: code = NotFound desc = could not find container \"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\": container with ID starting with 2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373 not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977351 4791 scope.go:117] "RemoveContainer" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.977571 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\": container with ID starting with 878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0 not found: ID does not exist" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977652 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0"} err="failed to get container status \"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\": rpc error: code = NotFound desc = could not find container \"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\": container with ID starting with 878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0 not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977775 4791 scope.go:117] "RemoveContainer" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.978081 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\": container with ID starting with ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c not found: ID does not exist" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.978161 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c"} err="failed to get container status \"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\": rpc error: code = NotFound desc = could not find container \"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\": container with ID 
starting with ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.978233 4791 scope.go:117] "RemoveContainer" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.978950 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\": container with ID starting with 8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118 not found: ID does not exist" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.979027 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118"} err="failed to get container status \"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\": rpc error: code = NotFound desc = could not find container \"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\": container with ID starting with 8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118 not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.979099 4791 scope.go:117] "RemoveContainer" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.979727 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\": container with ID starting with 8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf not found: ID does not exist" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" Feb 17 
00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.979844 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf"} err="failed to get container status \"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\": rpc error: code = NotFound desc = could not find container \"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\": container with ID starting with 8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf not found: ID does not exist" Feb 17 00:09:41 crc kubenswrapper[4791]: I0217 00:09:41.235899 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.331167 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.331907 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.332269 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.332603 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection 
refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.333044 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.333078 4791 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.333345 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.455404 4791 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.456049 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:42 crc kubenswrapper[4791]: W0217 00:09:42.486745 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3 WatchSource:0}: Error finding container 4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3: Status 404 returned error can't find the container with id 4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3 Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.489944 4791 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894e0203a8ccd1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,LastTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.534803 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.870761 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"} Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.870821 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3"} Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.871609 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.871752 4791 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.936546 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Feb 17 00:09:43 crc kubenswrapper[4791]: I0217 00:09:43.224011 4791 
status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:43 crc kubenswrapper[4791]: E0217 00:09:43.738950 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Feb 17 00:09:44 crc kubenswrapper[4791]: E0217 00:09:44.052326 4791 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894e0203a8ccd1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,LastTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 00:09:45 crc kubenswrapper[4791]: E0217 00:09:45.340902 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Feb 17 00:09:48 crc kubenswrapper[4791]: E0217 00:09:48.541755 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="6.4s" Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.920225 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.920568 4791 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86" exitCode=1 Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.920615 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86"} Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.921469 4791 scope.go:117] "RemoveContainer" containerID="d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86" Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.921845 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:50 crc kubenswrapper[4791]: 
I0217 00:09:50.922547 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.219507 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.220701 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.221439 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.239151 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.239189 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: E0217 00:09:51.239629 4791 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.240181 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: W0217 00:09:51.262490 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026 WatchSource:0}: Error finding container 2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026: Status 404 returned error can't find the container with id 2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026 Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.931179 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.931312 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae95a0fefe7b673983f1d92ddb64d23be5f53d379dce7ac00415b0758451432d"} Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.936788 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.937143 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.948769 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bbb2e296ccc25b602f66a53d548d184bd8e5080c64c88ed3bd87081bfb7eadfe"} Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.948903 4791 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bbb2e296ccc25b602f66a53d548d184bd8e5080c64c88ed3bd87081bfb7eadfe" exitCode=0 Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.948954 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026"} Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.950037 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.950071 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: E0217 00:09:51.950912 4791 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.954320 4791 status_manager.go:851] "Failed 
to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.954900 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:52 crc kubenswrapper[4791]: I0217 00:09:52.958708 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1993b99b6471a61be1e29e24a843a3e24b3b94f6dce4054676a1305e3a708eba"} Feb 17 00:09:52 crc kubenswrapper[4791]: I0217 00:09:52.959662 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e5acb7f7789ad110aa8db7ba145560490b34b9358b207d27d4629c2a950d708"} Feb 17 00:09:52 crc kubenswrapper[4791]: I0217 00:09:52.959740 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3565e93f71e21d3ebbec12d754879bbce24af23ef089b372174f14e53d7efa9f"} Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.967828 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08a2a45703fe50ae521d3213c3369f28e782672e5eb97008b0828e580448ac8d"} Feb 17 00:09:53 crc 
kubenswrapper[4791]: I0217 00:09:53.969192 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f125604f9d9ac1a341e6af7798be8282fbdae6e0ed3c3a7a897740559fad2602"} Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.969288 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.968385 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.969435 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:55 crc kubenswrapper[4791]: I0217 00:09:55.401290 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:09:55 crc kubenswrapper[4791]: I0217 00:09:55.405708 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:09:55 crc kubenswrapper[4791]: I0217 00:09:55.983749 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:09:56 crc kubenswrapper[4791]: I0217 00:09:56.240338 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:56 crc kubenswrapper[4791]: I0217 00:09:56.240415 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:56 crc kubenswrapper[4791]: I0217 00:09:56.247788 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:58 crc kubenswrapper[4791]: I0217 00:09:58.977783 4791 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.001950 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.001992 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.005692 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.008803 4791 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="36ee7bce-4ced-431a-b5cd-18db071b5601" Feb 17 00:10:00 crc kubenswrapper[4791]: I0217 00:10:00.005944 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:10:00 crc kubenswrapper[4791]: I0217 00:10:00.005983 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:10:03 crc kubenswrapper[4791]: I0217 00:10:03.242663 4791 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="36ee7bce-4ced-431a-b5cd-18db071b5601" Feb 17 00:10:08 crc kubenswrapper[4791]: I0217 00:10:08.463834 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:10:08 crc kubenswrapper[4791]: I0217 00:10:08.945715 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.306245 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.364954 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.444582 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.604530 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.778678 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.787254 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.804741 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.978400 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.276905 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.308086 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.326403 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.510862 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.601093 4791 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.651580 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.685881 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.743584 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.951953 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.290820 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.508434 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.564217 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.574538 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.574600 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.612307 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.617932 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.657580 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.701892 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.770041 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.779802 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.817918 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.892831 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.911893 4791 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.918866 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.943799 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.963872 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.964230 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.988972 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.294886 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.356052 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.417997 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.502258 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.571250 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 00:10:12 crc 
kubenswrapper[4791]: I0217 00:10:12.754204 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.795338 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.808504 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.842245 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.858626 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.979853 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.992263 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.039473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.230462 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.272214 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.279423 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.401422 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.444261 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.554088 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.745604 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.760243 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.824954 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.838038 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.843931 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.860831 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.868530 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 00:10:13 crc 
kubenswrapper[4791]: I0217 00:10:13.892776 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.929380 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.111884 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.114745 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.182645 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.182656 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.315749 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.343814 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.379545 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.399066 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.459874 4791 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.538760 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.571797 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.592396 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.600771 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.716835 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.744230 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.795040 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.807220 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.821585 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.839786 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.884077 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.890277 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.955523 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.971836 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.996727 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.013780 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.027795 4791 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.032613 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.088376 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.136664 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.254966 4791 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.277322 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.395672 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.492816 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.695284 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.705924 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.713514 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.785894 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.812245 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.877676 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.993824 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 00:10:16 crc 
kubenswrapper[4791]: I0217 00:10:16.106072 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.160954 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.292270 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.442389 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.710237 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.723782 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.770562 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.780471 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.825068 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.860175 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.882816 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.012664 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.022999 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.069587 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.077440 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.100925 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.121385 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.201423 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.202659 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.218632 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.347825 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 
00:10:17.380557 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.443690 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.534925 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.570866 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.581961 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.633527 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.653756 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.711193 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.744390 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.748302 4791 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.755338 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 
00:10:17.755423 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.761052 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.777401 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.777345548 podStartE2EDuration="19.777345548s" podCreationTimestamp="2026-02-17 00:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:17.775370505 +0000 UTC m=+275.254883032" watchObservedRunningTime="2026-02-17 00:10:17.777345548 +0000 UTC m=+275.256858075" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.808623 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.828982 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.855629 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.905819 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.977853 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.985517 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 
00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.129518 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.152225 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.167573 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.171589 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.174363 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.175003 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.191572 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.192043 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.244627 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.263190 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 00:10:18 crc 
kubenswrapper[4791]: I0217 00:10:18.308244 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.362433 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.375632 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.428391 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.484279 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.610911 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.628776 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.644088 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.644365 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.652310 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.851062 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.870247 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.935681 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.979165 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.109039 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.123946 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.162927 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.172616 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.220711 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.360755 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.379742 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 
00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.399617 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.452418 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.454520 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.485721 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.489716 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.593290 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.609267 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.612633 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.632780 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.644859 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.650011 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.655208 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.708684 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.717894 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.719493 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.768743 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.963464 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.115593 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.126819 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.285287 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.303731 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.479042 4791 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.592263 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.638231 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.677381 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.702770 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.726553 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.735628 4791 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.742490 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.821823 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.829026 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.996466 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 00:10:21 crc 
kubenswrapper[4791]: I0217 00:10:21.005464 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.160634 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.194217 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.253049 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.344256 4791 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.344520 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" gracePeriod=5 Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.355624 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.455493 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.616840 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.646880 4791 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.659162 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.827067 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.085606 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.107256 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.329925 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.345288 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.497846 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.575952 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.607562 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.693296 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.710502 4791 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.779415 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.827635 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.863248 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.882791 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.884258 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.053257 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.074141 4791 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.126366 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.134925 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.217020 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.273955 4791 
reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.554980 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.595351 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.659358 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.672570 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.828490 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.883497 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.970394 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.209487 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.476473 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.615371 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.623440 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.661727 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.946648 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.019506 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.019729 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" containerID="cri-o://5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.035606 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.035833 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8xbcp" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server" containerID="cri-o://0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.047200 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.047519 4791 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" containerID="cri-o://1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.059265 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.059571 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" containerID="cri-o://5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.063773 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.064062 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s76xp" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server" containerID="cri-o://33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093130 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8x7k"] Feb 17 00:10:25 crc kubenswrapper[4791]: E0217 00:10:25.093386 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerName="installer" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093404 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerName="installer" Feb 17 00:10:25 crc 
kubenswrapper[4791]: E0217 00:10:25.093422 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093432 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093540 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerName="installer" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093558 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093999 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.100621 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8x7k"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.291084 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.291171 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.291221 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxfd\" (UniqueName: \"kubernetes.io/projected/66e06ad0-6874-4a52-94d8-76da74f7336b-kube-api-access-8kxfd\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.394027 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.394194 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.394250 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxfd\" (UniqueName: \"kubernetes.io/projected/66e06ad0-6874-4a52-94d8-76da74f7336b-kube-api-access-8kxfd\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.396277 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.401715 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.403782 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.416043 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxfd\" (UniqueName: \"kubernetes.io/projected/66e06ad0-6874-4a52-94d8-76da74f7336b-kube-api-access-8kxfd\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.519234 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.539016 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.584510 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.806131 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8x7k"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.921783 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.962584 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.061041 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.065824 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.100062 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105509 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"13a5be44-f180-42a9-bff7-8ba69cc589f0\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105591 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"6e6c03f6-847b-402c-bfde-6dd30870b907\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105624 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"6e6c03f6-847b-402c-bfde-6dd30870b907\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105649 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"13a5be44-f180-42a9-bff7-8ba69cc589f0\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105725 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"13a5be44-f180-42a9-bff7-8ba69cc589f0\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105776 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"6e6c03f6-847b-402c-bfde-6dd30870b907\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.107379 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "13a5be44-f180-42a9-bff7-8ba69cc589f0" (UID: "13a5be44-f180-42a9-bff7-8ba69cc589f0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.110665 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities" (OuterVolumeSpecName: "utilities") pod "6e6c03f6-847b-402c-bfde-6dd30870b907" (UID: "6e6c03f6-847b-402c-bfde-6dd30870b907"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.112316 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj" (OuterVolumeSpecName: "kube-api-access-6h6gj") pod "6e6c03f6-847b-402c-bfde-6dd30870b907" (UID: "6e6c03f6-847b-402c-bfde-6dd30870b907"). InnerVolumeSpecName "kube-api-access-6h6gj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.112787 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj" (OuterVolumeSpecName: "kube-api-access-2l6xj") pod "13a5be44-f180-42a9-bff7-8ba69cc589f0" (UID: "13a5be44-f180-42a9-bff7-8ba69cc589f0"). InnerVolumeSpecName "kube-api-access-2l6xj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.125782 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "13a5be44-f180-42a9-bff7-8ba69cc589f0" (UID: "13a5be44-f180-42a9-bff7-8ba69cc589f0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180119 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" exitCode=0
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180184 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180188 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180317 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180363 4791 scope.go:117] "RemoveContainer" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182211 4791 generic.go:334] "Generic (PLEG): container finished" podID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a" exitCode=0
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182283 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182303 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"0a2777b10322faf11f31316ab253e0d88d658e157f8af01279ebdea911a277fa"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182345 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.184730 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" event={"ID":"66e06ad0-6874-4a52-94d8-76da74f7336b","Type":"ContainerStarted","Data":"58eb500bf2f0a5e583661baccb8486b3a1f2366893b2ac87a31be78ba4c00230"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.184757 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" event={"ID":"66e06ad0-6874-4a52-94d8-76da74f7336b","Type":"ContainerStarted","Data":"d5e4395902c8c6c096ebff5df1fd9d99d8fbecbea5c749acb618b7dd081846bf"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.184983 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186093 4791 generic.go:334] "Generic (PLEG): container finished" podID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f" exitCode=0
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186238 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerDied","Data":"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186256 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerDied","Data":"f1a3439d45cbb877a9cdb806affb8d5e0982a3ff436258b9fc60b97b89a3ef01"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186296 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.187951 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8x7k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused" start-of-body=
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.187988 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" podUID="66e06ad0-6874-4a52-94d8-76da74f7336b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189258 4791 generic.go:334] "Generic (PLEG): container finished" podID="48855520-658c-4579-a867-7e984bce56c7" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01" exitCode=0
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189331 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189360 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"f3a8bf4a4e4255984cba5a86035a408c84d7e84e14a3acd43f2d8aaf7ecd5cee"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189426 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191521 4791 generic.go:334] "Generic (PLEG): container finished" podID="f04d6e19-5c11-4527-8a49-3208098d2575" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5" exitCode=0
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191557 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191578 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f"}
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191647 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.201863 4791 scope.go:117] "RemoveContainer" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206431 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"f04d6e19-5c11-4527-8a49-3208098d2575\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206485 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"db1caaaf-7e8b-405c-97ff-7c507f068688\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206514 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"f04d6e19-5c11-4527-8a49-3208098d2575\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206548 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"48855520-658c-4579-a867-7e984bce56c7\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206580 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"f04d6e19-5c11-4527-8a49-3208098d2575\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206616 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod \"db1caaaf-7e8b-405c-97ff-7c507f068688\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206659 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"48855520-658c-4579-a867-7e984bce56c7\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206690 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"48855520-658c-4579-a867-7e984bce56c7\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206789 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"db1caaaf-7e8b-405c-97ff-7c507f068688\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") "
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207021 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207042 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207054 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207068 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207079 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207842 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities" (OuterVolumeSpecName: "utilities") pod "db1caaaf-7e8b-405c-97ff-7c507f068688" (UID: "db1caaaf-7e8b-405c-97ff-7c507f068688"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207945 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities" (OuterVolumeSpecName: "utilities") pod "f04d6e19-5c11-4527-8a49-3208098d2575" (UID: "f04d6e19-5c11-4527-8a49-3208098d2575"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.208250 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities" (OuterVolumeSpecName: "utilities") pod "48855520-658c-4579-a867-7e984bce56c7" (UID: "48855520-658c-4579-a867-7e984bce56c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.212022 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw" (OuterVolumeSpecName: "kube-api-access-blntw") pod "f04d6e19-5c11-4527-8a49-3208098d2575" (UID: "f04d6e19-5c11-4527-8a49-3208098d2575"). InnerVolumeSpecName "kube-api-access-blntw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.216817 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd" (OuterVolumeSpecName: "kube-api-access-rq9sd") pod "48855520-658c-4579-a867-7e984bce56c7" (UID: "48855520-658c-4579-a867-7e984bce56c7"). InnerVolumeSpecName "kube-api-access-rq9sd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.238122 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" podStartSLOduration=1.238084751 podStartE2EDuration="1.238084751s" podCreationTimestamp="2026-02-17 00:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:26.213231063 +0000 UTC m=+283.692743580" watchObservedRunningTime="2026-02-17 00:10:26.238084751 +0000 UTC m=+283.717597278"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.238206 4791 scope.go:117] "RemoveContainer" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.238559 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc" (OuterVolumeSpecName: "kube-api-access-2txwc") pod "db1caaaf-7e8b-405c-97ff-7c507f068688" (UID: "db1caaaf-7e8b-405c-97ff-7c507f068688"). InnerVolumeSpecName "kube-api-access-2txwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.244184 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.247862 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.259561 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db1caaaf-7e8b-405c-97ff-7c507f068688" (UID: "db1caaaf-7e8b-405c-97ff-7c507f068688"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.260847 4791 scope.go:117] "RemoveContainer" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.261260 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1\": container with ID starting with 33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1 not found: ID does not exist" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.261305 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"} err="failed to get container status \"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1\": rpc error: code = NotFound desc = could not find container \"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1\": container with ID starting with 33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.261332 4791 scope.go:117] "RemoveContainer" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.261970 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5\": container with ID starting with f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5 not found: ID does not exist" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.261997 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"} err="failed to get container status \"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5\": rpc error: code = NotFound desc = could not find container \"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5\": container with ID starting with f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.262018 4791 scope.go:117] "RemoveContainer" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.262639 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8\": container with ID starting with ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8 not found: ID does not exist" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.263163 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"} err="failed to get container status \"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8\": rpc error: code = NotFound desc = could not find container \"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8\": container with ID starting with ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.263198 4791 scope.go:117] "RemoveContainer" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.276230 4791 scope.go:117] "RemoveContainer" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.295298 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e6c03f6-847b-402c-bfde-6dd30870b907" (UID: "6e6c03f6-847b-402c-bfde-6dd30870b907"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.300224 4791 scope.go:117] "RemoveContainer" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.301947 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48855520-658c-4579-a867-7e984bce56c7" (UID: "48855520-658c-4579-a867-7e984bce56c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308903 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308935 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308945 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308955 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308980 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308993 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.309002 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.309011 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.309020 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.310526 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f04d6e19-5c11-4527-8a49-3208098d2575" (UID: "f04d6e19-5c11-4527-8a49-3208098d2575"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.315504 4791 scope.go:117] "RemoveContainer" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.315985 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a\": container with ID starting with 5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a not found: ID does not exist" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.316050 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"} err="failed to get container status \"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a\": rpc error: code = NotFound desc = could not find container \"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a\": container with ID starting with 5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.316092 4791 scope.go:117] "RemoveContainer" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.318233 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943\": container with ID starting with 517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943 not found: ID does not exist" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318321 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"} err="failed to get container status \"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943\": rpc error: code = NotFound desc = could not find container \"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943\": container with ID starting with 517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318365 4791 scope.go:117] "RemoveContainer" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318450 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.318842 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd\": container with ID starting with 498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd not found: ID does not exist" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318875 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"} err="failed to get container status \"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd\": rpc error: code = NotFound desc = could not find container \"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd\": container with ID starting with 498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318904 4791 scope.go:117] "RemoveContainer" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.333857 4791 scope.go:117] "RemoveContainer" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.334357 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f\": container with ID starting with 1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f not found: ID does not exist" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.334396 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"} err="failed to get container status \"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f\": rpc error: code = NotFound desc = could not find container \"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f\": container with ID starting with 1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.334429 4791 scope.go:117] "RemoveContainer" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.353145 4791 scope.go:117] "RemoveContainer" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.374949 4791 scope.go:117] "RemoveContainer" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.389051 4791 scope.go:117] "RemoveContainer" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.390159 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01\": container with ID starting with 5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01 not found: ID does not exist" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.390202 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"} err="failed to get container status \"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01\": rpc error: code = NotFound desc = could not find container \"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01\": container with ID starting with 5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.390232 4791 scope.go:117] "RemoveContainer" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.391065 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61\": container with ID starting with 0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61 not found: ID does not exist" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.391187 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"} err="failed to get container status \"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61\": rpc error: code = NotFound desc = could not find container \"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61\": container with ID starting with 0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.391293 4791 scope.go:117] "RemoveContainer" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.393766 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82\": container with ID starting with bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82 not found: ID does not exist" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.393873 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"} err="failed to get container status \"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82\": rpc error: code = NotFound desc = could not find container \"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82\": container with ID starting with bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.393956 4791 scope.go:117] "RemoveContainer" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.408427 4791 scope.go:117] "RemoveContainer" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.409640 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.428645 4791 scope.go:117] "RemoveContainer" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.449146 4791 scope.go:117] "RemoveContainer" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.449649 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5\": container with ID starting with 0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5 not found: ID does not exist" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.449694 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"} err="failed to get container status \"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5\": rpc error: code = NotFound desc = could not find container \"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5\": container with ID starting with 0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.449728 4791 scope.go:117] "RemoveContainer" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.450138 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2\": container with ID starting with 2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2 not found: ID does not exist" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.450170 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"} err="failed to get container status \"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2\": rpc error: code = NotFound desc = could not find container \"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2\": container with ID starting with 2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.450188 4791 scope.go:117]
"RemoveContainer" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c" Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.450588 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c\": container with ID starting with 18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c not found: ID does not exist" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.450732 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"} err="failed to get container status \"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c\": rpc error: code = NotFound desc = could not find container \"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c\": container with ID starting with 18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c not found: ID does not exist" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.570124 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.575360 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.578265 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.581260 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.595580 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.600397 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.637414 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.660293 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.892461 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.893252 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019608 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019744 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019885 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019967 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020041 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020012 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020198 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020309 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020428 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021590 4791 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021720 4791 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021794 4791 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021861 4791 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.026408 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.123692 4791 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.206994 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.207182 4791 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" exitCode=137 Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.207499 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.207610 4791 scope.go:117] "RemoveContainer" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.214097 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.234585 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" path="/var/lib/kubelet/pods/13a5be44-f180-42a9-bff7-8ba69cc589f0/volumes" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.235359 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48855520-658c-4579-a867-7e984bce56c7" path="/var/lib/kubelet/pods/48855520-658c-4579-a867-7e984bce56c7/volumes" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.236403 4791 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" path="/var/lib/kubelet/pods/6e6c03f6-847b-402c-bfde-6dd30870b907/volumes" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.238268 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" path="/var/lib/kubelet/pods/db1caaaf-7e8b-405c-97ff-7c507f068688/volumes" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.239296 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" path="/var/lib/kubelet/pods/f04d6e19-5c11-4527-8a49-3208098d2575/volumes" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.245552 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.251294 4791 scope.go:117] "RemoveContainer" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" Feb 17 00:10:27 crc kubenswrapper[4791]: E0217 00:10:27.253419 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b\": container with ID starting with cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b not found: ID does not exist" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.253484 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"} err="failed to get container status \"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b\": rpc error: code = NotFound desc = could not find container 
\"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b\": container with ID starting with cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b not found: ID does not exist" Feb 17 00:10:43 crc kubenswrapper[4791]: I0217 00:10:43.003273 4791 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645304 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-npbnh"] Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645751 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645762 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645772 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645777 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645787 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645794 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645802 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 
00:10:47.645807 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645814 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645820 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645827 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645833 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645840 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645846 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645855 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645861 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-utilities" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645868 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 
00:10:47.645873 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645879 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645885 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645894 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645900 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645906 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645912 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-content" Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645922 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645934 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646029 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 
00:10:47.646042 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646049 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646059 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646067 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.648169 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.655388 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.667003 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-npbnh"] Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.792157 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-utilities\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.792238 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-catalog-content\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.792284 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5cnk\" (UniqueName: \"kubernetes.io/projected/c6f055fb-42f4-4699-8dd3-d93710f92ec8-kube-api-access-p5cnk\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.846700 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.847621 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.849785 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.853039 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.892882 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5cnk\" (UniqueName: \"kubernetes.io/projected/c6f055fb-42f4-4699-8dd3-d93710f92ec8-kube-api-access-p5cnk\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.892956 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-utilities\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.893143 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-catalog-content\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.893815 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-utilities\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.900701 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-catalog-content\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.914278 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5cnk\" (UniqueName: \"kubernetes.io/projected/c6f055fb-42f4-4699-8dd3-d93710f92ec8-kube-api-access-p5cnk\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.936047 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:10:47 crc 
kubenswrapper[4791]: I0217 00:10:47.936348 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" containerID="cri-o://eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc" gracePeriod=30 Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.994637 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.994710 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.994750 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.005649 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.045786 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.046034 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" containerID="cri-o://d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4" gracePeriod=30 Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.097934 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.098176 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.098225 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.099288 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.099643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.127938 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.179964 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.269866 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-npbnh"] Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.363669 4791 generic.go:334] "Generic (PLEG): container finished" podID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerID="d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4" exitCode=0 Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.363760 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerDied","Data":"d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365195 4791 generic.go:334] "Generic (PLEG): container finished" podID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerID="eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc" exitCode=0 Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365263 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerDied","Data":"eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365288 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerDied","Data":"f303e766e2c893e279f8d6e69a4b7c3a7060f8cee57e4d09d9bd789c6c1a5750"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365302 4791 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f303e766e2c893e279f8d6e69a4b7c3a7060f8cee57e4d09d9bd789c6c1a5750" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.366457 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerStarted","Data":"376fa6875549d425cfd29925fcb1e20160a0be88530f720a69bdfeec941be614"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.443926 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.481726 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.495061 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:10:48 crc kubenswrapper[4791]: W0217 00:10:48.502182 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeebe5038_a970_42a4_81d4_fa84e6a64dd2.slice/crio-c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c WatchSource:0}: Error finding container c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c: Status 404 returned error can't find the container with id c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602125 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602168 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602187 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602234 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602261 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602288 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602329 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: 
\"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602359 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602375 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603101 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603336 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config" (OuterVolumeSpecName: "config") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603623 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603732 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config" (OuterVolumeSpecName: "config") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603925 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca" (OuterVolumeSpecName: "client-ca") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.607866 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.608486 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.608911 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm" (OuterVolumeSpecName: "kube-api-access-pdjvm") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "kube-api-access-pdjvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.609600 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc" (OuterVolumeSpecName: "kube-api-access-k8zlc") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "kube-api-access-k8zlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704178 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704218 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704234 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704244 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704252 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704261 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704269 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704277 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704288 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.373682 4791 generic.go:334] "Generic (PLEG): container finished" podID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" exitCode=0 Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.373772 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" 
event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.374021 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerStarted","Data":"c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.376518 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerDied","Data":"0c85ff67a65174e4212f77cdeae113e56a44995c48f0f9d56c7ed9adda3bd480"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.376537 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.377142 4791 scope.go:117] "RemoveContainer" containerID="d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.388019 4791 generic.go:334] "Generic (PLEG): container finished" podID="c6f055fb-42f4-4699-8dd3-d93710f92ec8" containerID="b8de34051f15482ce1813ccc5a2fddc7f5b160ddd741da04a7a7423df82e67ec" exitCode=0 Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.388138 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.388168 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerDied","Data":"b8de34051f15482ce1813ccc5a2fddc7f5b160ddd741da04a7a7423df82e67ec"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.422549 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.424626 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.430258 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.450848 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.669666 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:10:49 crc kubenswrapper[4791]: E0217 00:10:49.669996 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670014 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: E0217 00:10:49.670031 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" 
containerName="controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670041 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670267 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670298 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670731 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.672338 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.672440 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.672945 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.673478 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.673979 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.674519 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.690358 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.691295 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.691629 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.692035 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.692773 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.692780 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.693891 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: 
I0217 00:10:49.695811 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.706245 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.719355 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.732926 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.817953 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818247 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818376 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: 
\"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818471 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818555 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818692 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818793 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818897 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818946 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.919890 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.919948 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.919987 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " 
pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920022 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920057 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920166 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920189 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920221 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.921014 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.921181 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.921597 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.922014 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.923008 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.925304 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.926177 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.938142 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " 
pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.944299 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.998936 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.012976 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.249878 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.251282 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.253470 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.257776 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.324987 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.325058 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.325084 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.396335 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" 
event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerStarted","Data":"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c"} Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.397926 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerStarted","Data":"bc4c8ae0954bf18bfe8f02494a1b79ad9057ab43b8c261668a6653bea1617632"} Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427199 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427294 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427320 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427779 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " 
pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427829 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.449644 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sv4n6"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.451086 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.452689 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.453683 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sv4n6"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.462074 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.503041 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.529309 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5m8\" (UniqueName: 
\"kubernetes.io/projected/f9f068a6-ed4e-4080-a05b-40562b5e8711-kube-api-access-xx5m8\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.529622 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-catalog-content\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.529714 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-utilities\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: W0217 00:10:50.531038 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd0f241_6d57_4bfa_a4f4_1e8a14005896.slice/crio-3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0 WatchSource:0}: Error finding container 3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0: Status 404 returned error can't find the container with id 3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0 Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.556527 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:10:50 crc kubenswrapper[4791]: W0217 00:10:50.563465 4791 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod717da635_adc5_4037_920f_c0bdec5fe8c2.slice/crio-86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b WatchSource:0}: Error finding container 86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b: Status 404 returned error can't find the container with id 86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.568430 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.630984 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-utilities\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5m8\" (UniqueName: \"kubernetes.io/projected/f9f068a6-ed4e-4080-a05b-40562b5e8711-kube-api-access-xx5m8\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631151 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-catalog-content\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631608 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-utilities\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631778 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-catalog-content\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.662498 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5m8\" (UniqueName: \"kubernetes.io/projected/f9f068a6-ed4e-4080-a05b-40562b5e8711-kube-api-access-xx5m8\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.767546 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.767556 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.227403 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" path="/var/lib/kubelet/pods/6a866a69-9159-4dd1-a03d-b2a0f703fb7b/volumes" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.228407 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" path="/var/lib/kubelet/pods/713c3460-f77d-4f7b-81bf-911f8f875dfe/volumes" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.231276 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sv4n6"] Feb 17 00:10:51 crc kubenswrapper[4791]: W0217 00:10:51.236238 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f068a6_ed4e_4080_a05b_40562b5e8711.slice/crio-46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c WatchSource:0}: Error finding container 46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c: Status 404 returned error can't find the container with id 46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.403475 4791 generic.go:334] "Generic (PLEG): container finished" podID="c6f055fb-42f4-4699-8dd3-d93710f92ec8" containerID="bc4c8ae0954bf18bfe8f02494a1b79ad9057ab43b8c261668a6653bea1617632" exitCode=0 Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.403533 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerDied","Data":"bc4c8ae0954bf18bfe8f02494a1b79ad9057ab43b8c261668a6653bea1617632"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.408058 4791 generic.go:334] "Generic (PLEG): container finished" 
podID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195" exitCode=0 Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.408250 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.408282 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerStarted","Data":"0c0c0ef37f45961765809fc7c0c9b4244d7c69f4387e7e7fe8ce8a7787ea122a"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.416504 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerStarted","Data":"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.416549 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerStarted","Data":"3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.416910 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.422773 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" 
event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerStarted","Data":"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.422813 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerStarted","Data":"86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.423541 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432311 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432352 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432880 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerStarted","Data":"80cf4739f1986abc88ab217832093d35a7db6e5168ccfb1b73c108c3d63e7a8d"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432929 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerStarted","Data":"46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.440729 4791 generic.go:334] "Generic (PLEG): container finished" podID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" 
containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" exitCode=0 Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.440772 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.443527 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" podStartSLOduration=3.443505917 podStartE2EDuration="3.443505917s" podCreationTimestamp="2026-02-17 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:51.439898473 +0000 UTC m=+308.919411010" watchObservedRunningTime="2026-02-17 00:10:51.443505917 +0000 UTC m=+308.923018454" Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.452125 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerStarted","Data":"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.456375 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerStarted","Data":"97129438bd2b1068cd253e37059d602f7645f34520aa321c33c29138a41ff427"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.458153 4791 generic.go:334] "Generic (PLEG): container finished" podID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2" exitCode=0 Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.458195 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.459899 4791 generic.go:334] "Generic (PLEG): container finished" podID="f9f068a6-ed4e-4080-a05b-40562b5e8711" containerID="80cf4739f1986abc88ab217832093d35a7db6e5168ccfb1b73c108c3d63e7a8d" exitCode=0 Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.459936 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerDied","Data":"80cf4739f1986abc88ab217832093d35a7db6e5168ccfb1b73c108c3d63e7a8d"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.474432 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v6qjq" podStartSLOduration=2.7530371970000003 podStartE2EDuration="5.474409459s" podCreationTimestamp="2026-02-17 00:10:47 +0000 UTC" firstStartedPulling="2026-02-17 00:10:49.375901161 +0000 UTC m=+306.855413708" lastFinishedPulling="2026-02-17 00:10:52.097273443 +0000 UTC m=+309.576785970" observedRunningTime="2026-02-17 00:10:52.47413018 +0000 UTC m=+309.953642717" watchObservedRunningTime="2026-02-17 00:10:52.474409459 +0000 UTC m=+309.953921986" Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.475270 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" podStartSLOduration=4.475262866 podStartE2EDuration="4.475262866s" podCreationTimestamp="2026-02-17 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:51.540928176 +0000 UTC m=+309.020440703" 
watchObservedRunningTime="2026-02-17 00:10:52.475262866 +0000 UTC m=+309.954775393" Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.511170 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-npbnh" podStartSLOduration=2.893352321 podStartE2EDuration="5.51115428s" podCreationTimestamp="2026-02-17 00:10:47 +0000 UTC" firstStartedPulling="2026-02-17 00:10:49.39025399 +0000 UTC m=+306.869766517" lastFinishedPulling="2026-02-17 00:10:52.008055949 +0000 UTC m=+309.487568476" observedRunningTime="2026-02-17 00:10:52.510161328 +0000 UTC m=+309.989673875" watchObservedRunningTime="2026-02-17 00:10:52.51115428 +0000 UTC m=+309.990666797" Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.465949 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerStarted","Data":"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"} Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.467523 4791 generic.go:334] "Generic (PLEG): container finished" podID="f9f068a6-ed4e-4080-a05b-40562b5e8711" containerID="04145e421270586091691ca68902e3e565279b19ac1b4a2bc59f3b5979c68b29" exitCode=0 Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.467649 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerDied","Data":"04145e421270586091691ca68902e3e565279b19ac1b4a2bc59f3b5979c68b29"} Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.499402 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbgxw" podStartSLOduration=2.088926142 podStartE2EDuration="3.499372346s" podCreationTimestamp="2026-02-17 00:10:50 +0000 UTC" firstStartedPulling="2026-02-17 00:10:51.410876735 +0000 UTC m=+308.890389262" 
lastFinishedPulling="2026-02-17 00:10:52.821322939 +0000 UTC m=+310.300835466" observedRunningTime="2026-02-17 00:10:53.495941228 +0000 UTC m=+310.975453765" watchObservedRunningTime="2026-02-17 00:10:53.499372346 +0000 UTC m=+310.978884883" Feb 17 00:10:54 crc kubenswrapper[4791]: I0217 00:10:54.473971 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerStarted","Data":"02221e817a7182ca546ec97737c6aac24abf44e4afca0417aa3e9df285e3c9e7"} Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.007300 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.007591 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.052922 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.074675 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sv4n6" podStartSLOduration=5.610781404 podStartE2EDuration="8.074660176s" podCreationTimestamp="2026-02-17 00:10:50 +0000 UTC" firstStartedPulling="2026-02-17 00:10:51.434305248 +0000 UTC m=+308.913817775" lastFinishedPulling="2026-02-17 00:10:53.89818402 +0000 UTC m=+311.377696547" observedRunningTime="2026-02-17 00:10:54.49455395 +0000 UTC m=+311.974066487" watchObservedRunningTime="2026-02-17 00:10:58.074660176 +0000 UTC m=+315.554172713" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.181147 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 
00:10:58.181228 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.221784 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.532958 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.555131 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.569607 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.569651 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.633466 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.768147 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.768220 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.836497 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:01 crc kubenswrapper[4791]: I0217 00:11:01.578659 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:01 crc kubenswrapper[4791]: I0217 00:11:01.580833 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:07 crc kubenswrapper[4791]: I0217 00:11:07.939801 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:11:07 crc kubenswrapper[4791]: I0217 00:11:07.941805 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" containerID="cri-o://efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" gracePeriod=30 Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.446211 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548336 4791 generic.go:334] "Generic (PLEG): container finished" podID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" exitCode=0 Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548382 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerDied","Data":"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea"} Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548414 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerDied","Data":"3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0"} Feb 17 00:11:08 crc 
kubenswrapper[4791]: I0217 00:11:08.548430 4791 scope.go:117] "RemoveContainer" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548452 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.566618 4791 scope.go:117] "RemoveContainer" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.567904 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.567955 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: E0217 00:11:08.567953 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea\": container with ID starting with efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea not found: ID does not exist" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568000 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod 
\"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568000 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea"} err="failed to get container status \"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea\": rpc error: code = NotFound desc = could not find container \"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea\": container with ID starting with efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea not found: ID does not exist" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568085 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568164 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.569355 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.569399 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config" (OuterVolumeSpecName: "config") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.569431 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.577529 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r" (OuterVolumeSpecName: "kube-api-access-vz76r") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "kube-api-access-vz76r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.589489 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669522 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669559 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669573 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669587 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669600 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.879207 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.887508 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.233622 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" 
path="/var/lib/kubelet/pods/6cd0f241-6d57-4bfa-a4f4-1e8a14005896/volumes" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.681880 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-568f8d896-cssls"] Feb 17 00:11:09 crc kubenswrapper[4791]: E0217 00:11:09.682195 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.682217 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.682342 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.682843 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.685758 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.685859 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686226 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686274 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686392 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686408 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.694367 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568f8d896-cssls"] Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.699275 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.783714 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzckm\" (UniqueName: \"kubernetes.io/projected/2f391307-5f7e-434c-b3a8-8a10278deaa7-kube-api-access-qzckm\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.783787 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-config\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.783813 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f391307-5f7e-434c-b3a8-8a10278deaa7-serving-cert\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 
00:11:09.783915 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-client-ca\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.784039 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-proxy-ca-bundles\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885343 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-config\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885423 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f391307-5f7e-434c-b3a8-8a10278deaa7-serving-cert\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-client-ca\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " 
pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885569 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-proxy-ca-bundles\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885623 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzckm\" (UniqueName: \"kubernetes.io/projected/2f391307-5f7e-434c-b3a8-8a10278deaa7-kube-api-access-qzckm\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.887049 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-client-ca\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.888541 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-proxy-ca-bundles\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.888875 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-config\") pod 
\"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.893521 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f391307-5f7e-434c-b3a8-8a10278deaa7-serving-cert\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.907073 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzckm\" (UniqueName: \"kubernetes.io/projected/2f391307-5f7e-434c-b3a8-8a10278deaa7-kube-api-access-qzckm\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:10 crc kubenswrapper[4791]: I0217 00:11:10.001806 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:10 crc kubenswrapper[4791]: I0217 00:11:10.452302 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568f8d896-cssls"] Feb 17 00:11:10 crc kubenswrapper[4791]: W0217 00:11:10.461199 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f391307_5f7e_434c_b3a8_8a10278deaa7.slice/crio-bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8 WatchSource:0}: Error finding container bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8: Status 404 returned error can't find the container with id bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8 Feb 17 00:11:10 crc kubenswrapper[4791]: I0217 00:11:10.563485 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" event={"ID":"2f391307-5f7e-434c-b3a8-8a10278deaa7","Type":"ContainerStarted","Data":"bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8"} Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.568663 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" event={"ID":"2f391307-5f7e-434c-b3a8-8a10278deaa7","Type":"ContainerStarted","Data":"95069b9be18ff43096a939103e2b7b0e6f8b87a7fef8214af4141bce49ce0930"} Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.568962 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.574183 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.591911 4791 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" podStartSLOduration=4.591892723 podStartE2EDuration="4.591892723s" podCreationTimestamp="2026-02-17 00:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:11:11.590906034 +0000 UTC m=+329.070418571" watchObservedRunningTime="2026-02-17 00:11:11.591892723 +0000 UTC m=+329.071405250" Feb 17 00:11:27 crc kubenswrapper[4791]: I0217 00:11:27.926928 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:11:27 crc kubenswrapper[4791]: I0217 00:11:27.927694 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" containerID="cri-o://a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" gracePeriod=30 Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.350528 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431647 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431770 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431797 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.432439 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.432627 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config" (OuterVolumeSpecName: "config") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.437474 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8" (OuterVolumeSpecName: "kube-api-access-xcvm8") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "kube-api-access-xcvm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.445223 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532640 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532683 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532696 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532706 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672332 4791 generic.go:334] "Generic (PLEG): container finished" podID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" exitCode=0 Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672381 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672394 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerDied","Data":"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7"} Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672439 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerDied","Data":"86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b"} Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672471 4791 scope.go:117] "RemoveContainer" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.693439 4791 scope.go:117] "RemoveContainer" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" Feb 17 00:11:28 crc kubenswrapper[4791]: E0217 00:11:28.697030 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7\": container with ID starting with a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7 not found: ID does not exist" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.697083 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7"} err="failed to get container status \"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7\": rpc error: code = NotFound desc 
= could not find container \"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7\": container with ID starting with a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7 not found: ID does not exist" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.707603 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.710396 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.230788 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" path="/var/lib/kubelet/pods/717da635-adc5-4037-920f-c0bdec5fe8c2/volumes" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.692768 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56"] Feb 17 00:11:29 crc kubenswrapper[4791]: E0217 00:11:29.693437 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.693458 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.693678 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.694467 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.698297 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.699010 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.699903 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.700325 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.705688 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.706281 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.714470 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56"] Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.748393 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-serving-cert\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.748747 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-config\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.748918 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-client-ca\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.749033 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcq9b\" (UniqueName: \"kubernetes.io/projected/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-kube-api-access-qcq9b\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850698 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-serving-cert\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850809 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-config\") pod \"route-controller-manager-6c5f564774-2dv56\" 
(UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850894 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-client-ca\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850931 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcq9b\" (UniqueName: \"kubernetes.io/projected/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-kube-api-access-qcq9b\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.853139 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-client-ca\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.853810 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-config\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.856644 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-serving-cert\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.889348 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcq9b\" (UniqueName: \"kubernetes.io/projected/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-kube-api-access-qcq9b\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.034896 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.479091 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56"] Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.688336 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" event={"ID":"e8021891-e951-4ce4-bfc5-22c78ac8d0c2","Type":"ContainerStarted","Data":"d9a3e0d0ff27ebc95a49c73d761ad6d0b1a174dd5a21a00e31b713daf40a5d4a"} Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.688695 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.688709 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" 
event={"ID":"e8021891-e951-4ce4-bfc5-22c78ac8d0c2","Type":"ContainerStarted","Data":"0762f5fbe709465d429332d01dd3c07ae7649c8132c8ffc6d91ebe8371f3c7ab"} Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.709559 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" podStartSLOduration=3.7095335499999997 podStartE2EDuration="3.70953355s" podCreationTimestamp="2026-02-17 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:11:30.707207619 +0000 UTC m=+348.186720146" watchObservedRunningTime="2026-02-17 00:11:30.70953355 +0000 UTC m=+348.189046117" Feb 17 00:11:31 crc kubenswrapper[4791]: I0217 00:11:31.366754 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:54 crc kubenswrapper[4791]: I0217 00:11:54.973456 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:11:54 crc kubenswrapper[4791]: I0217 00:11:54.974193 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.219764 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vxszn"] Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.221183 4791 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.232530 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vxszn"] Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284124 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgsg7\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-kube-api-access-xgsg7\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284193 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78c3872a-ce72-48e1-aee7-a3a20b86759c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284218 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-trusted-ca\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284329 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284368 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78c3872a-ce72-48e1-aee7-a3a20b86759c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284389 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-tls\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284431 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-certificates\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284456 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-bound-sa-token\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.328253 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386072 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78c3872a-ce72-48e1-aee7-a3a20b86759c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386136 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-tls\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386171 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-certificates\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386209 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-bound-sa-token\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386247 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgsg7\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-kube-api-access-xgsg7\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386265 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78c3872a-ce72-48e1-aee7-a3a20b86759c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386294 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-trusted-ca\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.387674 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78c3872a-ce72-48e1-aee7-a3a20b86759c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.387767 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-trusted-ca\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc 
kubenswrapper[4791]: I0217 00:12:00.387697 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-certificates\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.397462 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78c3872a-ce72-48e1-aee7-a3a20b86759c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.397491 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-tls\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.403915 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-bound-sa-token\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.418229 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgsg7\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-kube-api-access-xgsg7\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.541657 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.993273 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vxszn"] Feb 17 00:12:01 crc kubenswrapper[4791]: W0217 00:12:01.003565 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c3872a_ce72_48e1_aee7_a3a20b86759c.slice/crio-1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95 WatchSource:0}: Error finding container 1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95: Status 404 returned error can't find the container with id 1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95 Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.892673 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" event={"ID":"78c3872a-ce72-48e1-aee7-a3a20b86759c","Type":"ContainerStarted","Data":"df61d29ed86092cabced97f03c512c2cd2b013edaccab6e732d41f29b84f9639"} Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.893010 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" event={"ID":"78c3872a-ce72-48e1-aee7-a3a20b86759c","Type":"ContainerStarted","Data":"1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95"} Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.893033 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.927155 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" podStartSLOduration=1.9270866500000001 podStartE2EDuration="1.92708665s" podCreationTimestamp="2026-02-17 00:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:12:01.918482939 +0000 UTC m=+379.397995496" watchObservedRunningTime="2026-02-17 00:12:01.92708665 +0000 UTC m=+379.406599217" Feb 17 00:12:20 crc kubenswrapper[4791]: I0217 00:12:20.549397 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:20 crc kubenswrapper[4791]: I0217 00:12:20.617029 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:12:24 crc kubenswrapper[4791]: I0217 00:12:24.973532 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:12:24 crc kubenswrapper[4791]: I0217 00:12:24.974199 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:12:45 crc kubenswrapper[4791]: I0217 00:12:45.669337 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry" containerID="cri-o://13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" gracePeriod=30 Feb 17 00:12:46 crc kubenswrapper[4791]: 
I0217 00:12:46.063594 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167276 4791 generic.go:334] "Generic (PLEG): container finished" podID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" exitCode=0 Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167338 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerDied","Data":"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3"} Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167375 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerDied","Data":"a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55"} Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167400 4791 scope.go:117] "RemoveContainer" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167560 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182283 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182567 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182629 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182678 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182743 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182801 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182837 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182875 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.184493 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.186381 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.197634 4791 scope.go:117] "RemoveContainer" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.197850 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.198436 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: E0217 00:12:46.199683 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3\": container with ID starting with 13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3 not found: ID does not exist" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.199731 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3"} err="failed to get container status \"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3\": rpc error: code = NotFound desc = could not find container \"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3\": container with ID starting with 13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3 not found: ID does not exist" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.200470 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.201451 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5" (OuterVolumeSpecName: "kube-api-access-ptqw5") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "kube-api-access-ptqw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.207966 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.209222 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.283914 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.283956 4791 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.283972 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284150 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284219 4791 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284240 4791 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284255 4791 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.520765 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.529475 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:12:47 crc kubenswrapper[4791]: I0217 00:12:47.233422 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" path="/var/lib/kubelet/pods/c33165ce-519a-4b0e-b62a-f153d38fc14c/volumes" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.972676 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.973046 4791 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.973154 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.973942 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.974034 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f" gracePeriod=600 Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.237409 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f" exitCode=0 Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.237621 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f"} Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.238217 
4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"}
Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.238252 4791 scope.go:117] "RemoveContainer" containerID="7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"
Feb 17 00:14:43 crc kubenswrapper[4791]: I0217 00:14:43.454304 4791 scope.go:117] "RemoveContainer" containerID="eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.168374 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"]
Feb 17 00:15:00 crc kubenswrapper[4791]: E0217 00:15:00.169886 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.169969 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.170143 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.170549 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.172384 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.172852 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.182000 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"]
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.291550 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.291934 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.292096 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.392662 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.392718 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.392759 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.393616 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.409062 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.410087 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.491975 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.689938 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"]
Feb 17 00:15:01 crc kubenswrapper[4791]: I0217 00:15:01.205428 4791 generic.go:334] "Generic (PLEG): container finished" podID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerID="99223c1a137749948cf8689e01f08f71f3aa17cb6c92040b021d417d8ae7e17e" exitCode=0
Feb 17 00:15:01 crc kubenswrapper[4791]: I0217 00:15:01.205619 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" event={"ID":"369d6cd5-3681-44c7-b799-b8e0e9bf2a65","Type":"ContainerDied","Data":"99223c1a137749948cf8689e01f08f71f3aa17cb6c92040b021d417d8ae7e17e"}
Feb 17 00:15:01 crc kubenswrapper[4791]: I0217 00:15:01.207331 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" event={"ID":"369d6cd5-3681-44c7-b799-b8e0e9bf2a65","Type":"ContainerStarted","Data":"a142eef943ee69deda70b59ed181fe0b2f60b27aafc5b4fe4a8fe33752d4aff1"}
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.545100 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618015 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") "
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618251 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") "
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618337 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") "
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618996 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume" (OuterVolumeSpecName: "config-volume") pod "369d6cd5-3681-44c7-b799-b8e0e9bf2a65" (UID: "369d6cd5-3681-44c7-b799-b8e0e9bf2a65"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.623604 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "369d6cd5-3681-44c7-b799-b8e0e9bf2a65" (UID: "369d6cd5-3681-44c7-b799-b8e0e9bf2a65"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.623989 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn" (OuterVolumeSpecName: "kube-api-access-jtccn") pod "369d6cd5-3681-44c7-b799-b8e0e9bf2a65" (UID: "369d6cd5-3681-44c7-b799-b8e0e9bf2a65"). InnerVolumeSpecName "kube-api-access-jtccn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.720621 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.720672 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.720707 4791 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:03 crc kubenswrapper[4791]: I0217 00:15:03.227027 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"
Feb 17 00:15:03 crc kubenswrapper[4791]: I0217 00:15:03.231613 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" event={"ID":"369d6cd5-3681-44c7-b799-b8e0e9bf2a65","Type":"ContainerDied","Data":"a142eef943ee69deda70b59ed181fe0b2f60b27aafc5b4fe4a8fe33752d4aff1"}
Feb 17 00:15:03 crc kubenswrapper[4791]: I0217 00:15:03.231684 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a142eef943ee69deda70b59ed181fe0b2f60b27aafc5b4fe4a8fe33752d4aff1"
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.209303 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"]
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210527 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller" containerID="cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210583 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb" containerID="cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210657 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb" containerID="cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210755 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging" containerID="cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210774 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210790 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node" containerID="cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210803 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd" containerID="cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" gracePeriod=30
Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.274896 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" containerID="cri-o://ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" gracePeriod=30
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.036602 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.043434 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-acl-logging/0.log"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.044735 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-controller/0.log"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.045665 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.140175 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6mh9d"]
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.140718 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.140858 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.140939 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141010 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141082 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141182 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141263 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141328 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141403 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141471 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141537 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141608 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141681 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerName="collect-profiles"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141747 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerName="collect-profiles"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141809 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141885 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141954 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142016 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.142089 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142184 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.142254 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kubecfg-setup"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142329 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kubecfg-setup"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.142400 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142465 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142653 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142734 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142810 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142879 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142948 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143020 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143088 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143209 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143274 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143333 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143401 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143467 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerName="collect-profiles"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143534 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.143706 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143771 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.144050 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.144144 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.146499 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185226 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185285 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185321 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185359 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185402 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185438 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185469 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185499 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185533 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185567 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185599 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185629 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185657 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185659 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185699 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185734 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185752 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185772 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185808 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185844 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185844 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185892 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185937 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185967 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") "
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186296 4791 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186325 4791 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186346 4791 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186362 4791 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185890 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185932 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185967 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186561 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash" (OuterVolumeSpecName: "host-slash") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186608 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187427 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187476 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187523 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187594 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187627 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187636 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket" (OuterVolumeSpecName: "log-socket") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.188057 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.189619 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log" (OuterVolumeSpecName: "node-log") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.194831 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg" (OuterVolumeSpecName: "kube-api-access-r26vg") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "kube-api-access-r26vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.194849 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.206427 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.278280 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.281206 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-acl-logging/0.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.281743 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-controller/0.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282243 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282289 4791 generic.go:334] "Generic (PLEG): 
container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" exitCode=0
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282302 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" exitCode=0
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282311 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" exitCode=0
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282319 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" exitCode=0
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282327 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" exitCode=0
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282335 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" exitCode=143
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282344 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" exitCode=143
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282359 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282398 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282409 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282418 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282396 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282429 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282501 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282523 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282536 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282570 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282578 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282585 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"}
Feb 17 00:15:07 crc kubenswrapper[4791]:
I0217 00:15:07.282592 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282599 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282604 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282611 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282622 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282662 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282672 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282680 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282687 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282733 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282746 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282753 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282761 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282768 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282774 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282785 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282819 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282833 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282841 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282848 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282855 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282861 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282868 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282874 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282881 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282887 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282898 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"8359c5871ee1aee2d63af5dec0cce97a0b6622d7bd312c2093b490d8e6067659"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282909 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282918 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282925 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282954 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282963 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282970 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282977 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282984 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282991 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282999 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282757 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.286514 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/2.log"
Feb 17
00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288470 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288549 4791 generic.go:334] "Generic (PLEG): container finished" podID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea" exitCode=2
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288580 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerDied","Data":"583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288604 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"}
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288946 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-netd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288977 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovn-node-metrics-cert\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289029 4791 scope.go:117] "RemoveContainer" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289029 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-systemd-units\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289194 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-netns\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289347 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.289352 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-299s7_openshift-multus(1104c109-74aa-4fc4-8a1b-914a0d5803a4)\"" pod="openshift-multus/multus-299s7" podUID="1104c109-74aa-4fc4-8a1b-914a0d5803a4"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289386 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-log-socket\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289485 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-node-log\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289529 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289575 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-etc-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289613 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-kubelet\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289634 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-config\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289677 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-script-lib\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289788 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-systemd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289835 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-slash\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289921 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-ovn\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289961 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-env-overrides\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290021 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnddl\" (UniqueName: \"kubernetes.io/projected/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-kube-api-access-wnddl\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290054 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-bin\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290075 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-var-lib-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290197 4791 reconciler_common.go:293] "Volume detached for volume \"host-slash\"
(UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290220 4791 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290232 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290243 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290281 4791 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290293 4791 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290303 4791 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290314 4791 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290325 4791 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290359 4791 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290372 4791 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290383 4791 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290393 4791 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290404 4791 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290439 4791 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290452 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.310724 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"]
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.315021 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"]
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.317863 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.360832 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.380996 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391688 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-ovn\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391718 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-env-overrides\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391738 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnddl\" (UniqueName:
\"kubernetes.io/projected/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-kube-api-access-wnddl\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391752 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-bin\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391768 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-var-lib-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391793 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-netd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391809 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovn-node-metrics-cert\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391824 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-systemd-units\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391838 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-netns\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391855 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391871 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-log-socket\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391892 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-node-log\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391912 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391932 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-etc-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391949 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-kubelet\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391964 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-config\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391982 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392003 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-script-lib\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392034 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-systemd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392070 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-slash\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-slash\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392376 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392466 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-netd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392535 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-var-lib-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392528 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-node-log\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392503 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392959 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-ovn\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392644 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-systemd\") pod \"ovnkube-node-6mh9d\" (UID: 
\"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392678 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-kubelet\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392666 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-etc-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392713 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-systemd-units\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392718 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-netns\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392735 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-bin\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc 
kubenswrapper[4791]: I0217 00:15:07.392627 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.393085 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-config\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.393155 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-log-socket\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.393505 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-env-overrides\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.394731 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-script-lib\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.399715 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovn-node-metrics-cert\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.406337 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.408594 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnddl\" (UniqueName: \"kubernetes.io/projected/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-kube-api-access-wnddl\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.447407 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.466358 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.468343 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.484209 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.499976 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.519295 4791 scope.go:117] "RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535026 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.535513 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535550 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535579 4791 scope.go:117] "RemoveContainer" 
containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.535849 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535876 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535894 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.536457 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536502 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536530 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.536844 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536888 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536915 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.537404 4791 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537429 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537445 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.537699 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537739 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container 
\"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537764 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.538046 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538071 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538130 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.538378 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" 
containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538441 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538467 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.538817 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538853 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538880 4791 scope.go:117] 
"RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.539370 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539402 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539436 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539694 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539721 4791 
scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540085 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540152 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540485 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540545 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541205 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc 
error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541240 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541477 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541501 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.543624 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.543664 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc 
kubenswrapper[4791]: I0217 00:15:07.543994 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544033 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544429 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544460 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544791 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container 
with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544817 4791 scope.go:117] "RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545143 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545174 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545492 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545518 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545819 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545851 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546285 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546309 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546606 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not 
exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546637 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546896 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546924 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547312 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547349 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547863 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status 
\"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547891 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548183 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548213 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548575 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548603 4791 scope.go:117] "RemoveContainer" 
containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548857 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548886 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549191 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549221 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549507 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could 
not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549534 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549780 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549807 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550176 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550200 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 
00:15:07.550563 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550609 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550924 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550955 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551356 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 
74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551389 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551711 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551739 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552078 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552127 4791 scope.go:117] "RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552427 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552455 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552691 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:08 crc kubenswrapper[4791]: I0217 00:15:08.298086 4791 generic.go:334] "Generic (PLEG): container finished" podID="b4f2b02f-cfc0-42a9-832d-adb0268cc26d" containerID="585bacda4be2074c95265d66c30093285f0e868bffc068a4d3024a00f52106f7" exitCode=0 Feb 17 00:15:08 crc kubenswrapper[4791]: I0217 00:15:08.298150 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerDied","Data":"585bacda4be2074c95265d66c30093285f0e868bffc068a4d3024a00f52106f7"} Feb 17 00:15:08 crc kubenswrapper[4791]: I0217 00:15:08.298458 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" 
event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"512468c2f81ce78eb3c74943d6bcb5168342c22846b822ee264278c60f5d33cd"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.230284 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" path="/var/lib/kubelet/pods/e7fe508f-1e8c-4da7-8f99-108e73cb3791/volumes" Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311700 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"90e65f8c4449195dd5f2d09a2fc95b311cfbdd8f7490988154f053dff6c96d25"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311748 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"7a766979695ee77b4349b273f68f6946cf77a3df27993971861adc5c90b2150d"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311766 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"4321bb07633317ea5c5d2c079e48ffe11aea22b184cb12bf4465c6d02470e055"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311776 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"77b859ce02b9ed54f03c44b5ba8a45e0fd01d5247eecd158ae4b0652397f5b8b"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311785 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" 
event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"f85e14266e55e68e5b052b3fd0448e23cf9430ef8d3abc131ced675b9545599a"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311795 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"4fd8cb0e6ae478527d55bfae465a834cfd420e5c83378db28477e6272c75336b"} Feb 17 00:15:12 crc kubenswrapper[4791]: I0217 00:15:12.339193 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"8b01a32a7b8802fec40cdd1a91fddabe6819202e7f25b363c9ee688843e5ddc1"} Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.352751 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"1210297e17828fcc2f04ee549013a7ffa2a2a0b7da6fbbbf1801ccc000d6e576"} Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.353085 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.353265 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.353278 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.387418 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" podStartSLOduration=7.387403271 podStartE2EDuration="7.387403271s" podCreationTimestamp="2026-02-17 00:15:07 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:15:14.386781911 +0000 UTC m=+571.866294438" watchObservedRunningTime="2026-02-17 00:15:14.387403271 +0000 UTC m=+571.866915798" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.405907 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.407765 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:20 crc kubenswrapper[4791]: I0217 00:15:20.221168 4791 scope.go:117] "RemoveContainer" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea" Feb 17 00:15:20 crc kubenswrapper[4791]: E0217 00:15:20.221875 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-299s7_openshift-multus(1104c109-74aa-4fc4-8a1b-914a0d5803a4)\"" pod="openshift-multus/multus-299s7" podUID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" Feb 17 00:15:24 crc kubenswrapper[4791]: I0217 00:15:24.973048 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:15:24 crc kubenswrapper[4791]: I0217 00:15:24.974449 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 
00:15:33.222748 4791 scope.go:117] "RemoveContainer" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea"
Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 00:15:33.486269 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/2.log"
Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 00:15:33.487524 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log"
Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 00:15:33.487615 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"e2e684b560a24deca25b7db8c32d2dceb7734559d94b660c79340702f0e6bb29"}
Feb 17 00:15:37 crc kubenswrapper[4791]: I0217 00:15:37.500171 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d"
Feb 17 00:15:43 crc kubenswrapper[4791]: I0217 00:15:43.513991 4791 scope.go:117] "RemoveContainer" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"
Feb 17 00:15:44 crc kubenswrapper[4791]: I0217 00:15:44.558657 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/2.log"
Feb 17 00:15:54 crc kubenswrapper[4791]: I0217 00:15:54.972643 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:15:54 crc kubenswrapper[4791]: I0217 00:15:54.973024 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:16:07 crc kubenswrapper[4791]: I0217 00:16:07.693093 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"]
Feb 17 00:16:07 crc kubenswrapper[4791]: I0217 00:16:07.694226 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbgxw" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server" containerID="cri-o://c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" gracePeriod=30
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.032892 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.161875 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"ce989914-c2c6-4717-9acb-161dd734b4f6\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") "
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.161943 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"ce989914-c2c6-4717-9acb-161dd734b4f6\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") "
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.162021 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"ce989914-c2c6-4717-9acb-161dd734b4f6\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") "
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.163597 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities" (OuterVolumeSpecName: "utilities") pod "ce989914-c2c6-4717-9acb-161dd734b4f6" (UID: "ce989914-c2c6-4717-9acb-161dd734b4f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.168596 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657" (OuterVolumeSpecName: "kube-api-access-rz657") pod "ce989914-c2c6-4717-9acb-161dd734b4f6" (UID: "ce989914-c2c6-4717-9acb-161dd734b4f6"). InnerVolumeSpecName "kube-api-access-rz657". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.194379 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce989914-c2c6-4717-9acb-161dd734b4f6" (UID: "ce989914-c2c6-4717-9acb-161dd734b4f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.264100 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.264168 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.264184 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727875 4791 generic.go:334] "Generic (PLEG): container finished" podID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" exitCode=0
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727931 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"}
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727955 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727982 4791 scope.go:117] "RemoveContainer" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727966 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"0c0c0ef37f45961765809fc7c0c9b4244d7c69f4387e7e7fe8ce8a7787ea122a"}
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.746166 4791 scope.go:117] "RemoveContainer" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.760540 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"]
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.764348 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"]
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.787530 4791 scope.go:117] "RemoveContainer" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.803711 4791 scope.go:117] "RemoveContainer" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"
Feb 17 00:16:08 crc kubenswrapper[4791]: E0217 00:16:08.804349 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751\": container with ID starting with c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751 not found: ID does not exist" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804397 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"} err="failed to get container status \"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751\": rpc error: code = NotFound desc = could not find container \"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751\": container with ID starting with c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751 not found: ID does not exist"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804429 4791 scope.go:117] "RemoveContainer" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"
Feb 17 00:16:08 crc kubenswrapper[4791]: E0217 00:16:08.804884 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2\": container with ID starting with e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2 not found: ID does not exist" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804922 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"} err="failed to get container status \"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2\": rpc error: code = NotFound desc = could not find container \"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2\": container with ID starting with e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2 not found: ID does not exist"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804950 4791 scope.go:117] "RemoveContainer" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"
Feb 17 00:16:08 crc kubenswrapper[4791]: E0217 00:16:08.805402 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195\": container with ID starting with 3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195 not found: ID does not exist" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"
Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.805441 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"} err="failed to get container status \"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195\": rpc error: code = NotFound desc = could not find container \"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195\": container with ID starting with 3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195 not found: ID does not exist"
Feb 17 00:16:09 crc kubenswrapper[4791]: I0217 00:16:09.234578 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" path="/var/lib/kubelet/pods/ce989914-c2c6-4717-9acb-161dd734b4f6/volumes"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511047 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"]
Feb 17 00:16:11 crc kubenswrapper[4791]: E0217 00:16:11.511302 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-utilities"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511317 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-utilities"
Feb 17 00:16:11 crc kubenswrapper[4791]: E0217 00:16:11.511329 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511337 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server"
Feb 17 00:16:11 crc kubenswrapper[4791]: E0217 00:16:11.511349 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-content"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511357 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-content"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511469 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.512376 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.517632 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.537072 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"]
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.600410 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.600475 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.600510 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.701690 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.701788 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.701842 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.702280 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.702402 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.723287 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.828913 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.081410 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"]
Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.752999 4791 generic.go:334] "Generic (PLEG): container finished" podID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerID="2f5d3f8df0535fd228835286243796212d5a510efdc64603f9e1d050615d4714" exitCode=0
Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.753046 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"2f5d3f8df0535fd228835286243796212d5a510efdc64603f9e1d050615d4714"}
Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.753075 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerStarted","Data":"0869a21a026a58cea076884833871d1ce8ea04be88334ba3e5be2837d4fb535f"}
Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.758472 4791 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 00:16:14 crc kubenswrapper[4791]: I0217 00:16:14.770940 4791 generic.go:334] "Generic (PLEG): container finished" podID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerID="3e95f87bb062f252217212cffa73bf3194cc09b5136dec0e11fa71b8ff76fb22" exitCode=0
Feb 17 00:16:14 crc kubenswrapper[4791]: I0217 00:16:14.771050 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"3e95f87bb062f252217212cffa73bf3194cc09b5136dec0e11fa71b8ff76fb22"}
Feb 17 00:16:15 crc kubenswrapper[4791]: I0217 00:16:15.784824 4791 generic.go:334] "Generic (PLEG): container finished" podID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerID="ea53282404244d4b2707e959ca82bee2f7679d571902217b5dd1d3b672e7ad12" exitCode=0
Feb 17 00:16:15 crc kubenswrapper[4791]: I0217 00:16:15.784917 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"ea53282404244d4b2707e959ca82bee2f7679d571902217b5dd1d3b672e7ad12"}
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.105656 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.178198 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") "
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.178561 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") "
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.178619 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") "
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.192564 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5" (OuterVolumeSpecName: "kube-api-access-qcnq5") pod "306a7321-68e3-4f13-95d0-3c3dbee8b24f" (UID: "306a7321-68e3-4f13-95d0-3c3dbee8b24f"). InnerVolumeSpecName "kube-api-access-qcnq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.196164 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle" (OuterVolumeSpecName: "bundle") pod "306a7321-68e3-4f13-95d0-3c3dbee8b24f" (UID: "306a7321-68e3-4f13-95d0-3c3dbee8b24f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.215455 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util" (OuterVolumeSpecName: "util") pod "306a7321-68e3-4f13-95d0-3c3dbee8b24f" (UID: "306a7321-68e3-4f13-95d0-3c3dbee8b24f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.280464 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.280520 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.280539 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485066 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"]
Feb 17 00:16:17 crc kubenswrapper[4791]: E0217 00:16:17.485388 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="pull"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485410 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="pull"
Feb 17 00:16:17 crc kubenswrapper[4791]: E0217 00:16:17.485423 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="extract"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485433 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="extract"
Feb 17 00:16:17 crc kubenswrapper[4791]: E0217 00:16:17.485457 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="util"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485469 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="util"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485636 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="extract"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.487220 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.503404 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"]
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.584126 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.584410 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.584500 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.686346 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.686439 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.686513 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.687397 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.687501 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.717227 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.802088 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"0869a21a026a58cea076884833871d1ce8ea04be88334ba3e5be2837d4fb535f"}
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.802185 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0869a21a026a58cea076884833871d1ce8ea04be88334ba3e5be2837d4fb535f"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.802244 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"
Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.816499 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.056790 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"]
Feb 17 00:16:18 crc kubenswrapper[4791]: W0217 00:16:18.065079 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfce146a_61fa_4821_ab43_8fd35dc5fe07.slice/crio-4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1 WatchSource:0}: Error finding container 4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1: Status 404 returned error can't find the container with id 4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.262876 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"]
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.264407 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.275178 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"]
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.395648 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.395726 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.395798 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.496889 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.497471 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.497777 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.498383 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.498453 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.526290 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.580484 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.810903 4791 generic.go:334] "Generic (PLEG): container finished" podID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerID="78490f2b3615a7ddb7efd4af6351c0526b6b3fa122e398e922bf6a6ec7a152b3" exitCode=0
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.810995 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"78490f2b3615a7ddb7efd4af6351c0526b6b3fa122e398e922bf6a6ec7a152b3"}
Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.811322 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerStarted","Data":"4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1"}
Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.050361 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"]
Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.828477 4791 generic.go:334] "Generic (PLEG): container finished" podID="f156dae1-1d4a-47b3-835e-016325f1981c" containerID="68ef460952e68de60b1a58d94e99269c6162da95c7114adeff4d4e552688fd57" exitCode=0
Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.828576 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"68ef460952e68de60b1a58d94e99269c6162da95c7114adeff4d4e552688fd57"}
Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.828831 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerStarted","Data":"43ac63093f7b6c93cf4fb80892a768e3f1ded2f4bd27836925fe1e0e4db43b6b"}
Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 00:16:20.837227 4791 generic.go:334] "Generic (PLEG): container finished" podID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerID="70da6909dff98f6a15c3e6d61c22d2123946b8b9e42b29c7114857574709a440" exitCode=0
Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 00:16:20.837325 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"70da6909dff98f6a15c3e6d61c22d2123946b8b9e42b29c7114857574709a440"}
Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 00:16:20.839934 4791 generic.go:334] "Generic (PLEG): container finished" podID="f156dae1-1d4a-47b3-835e-016325f1981c" containerID="1af4cb3c07106a022003314ab268e8d7fd6f96516f42b97a6d5809a8d5ce3225" exitCode=0
Feb 17 00:16:20 crc kubenswrapper[4791]: I0217
00:16:20.839988 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"1af4cb3c07106a022003314ab268e8d7fd6f96516f42b97a6d5809a8d5ce3225"} Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.844990 4791 generic.go:334] "Generic (PLEG): container finished" podID="f156dae1-1d4a-47b3-835e-016325f1981c" containerID="44a4a5ae5c2f4f7f3cfac080a77f559de7e938d185002fad8d70a24cb2d0a5ee" exitCode=0 Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.845092 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"44a4a5ae5c2f4f7f3cfac080a77f559de7e938d185002fad8d70a24cb2d0a5ee"} Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.846984 4791 generic.go:334] "Generic (PLEG): container finished" podID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerID="84cf57ff591c5b3543988d83f4058aa23d61fd03b9c193df5a592ba5158f2116" exitCode=0 Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.847025 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"84cf57ff591c5b3543988d83f4058aa23d61fd03b9c193df5a592ba5158f2116"} Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.258497 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.280474 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377233 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"f156dae1-1d4a-47b3-835e-016325f1981c\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377299 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377325 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377346 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"f156dae1-1d4a-47b3-835e-016325f1981c\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377407 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"f156dae1-1d4a-47b3-835e-016325f1981c\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377439 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377914 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle" (OuterVolumeSpecName: "bundle") pod "bfce146a-61fa-4821-ab43-8fd35dc5fe07" (UID: "bfce146a-61fa-4821-ab43-8fd35dc5fe07"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.379208 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle" (OuterVolumeSpecName: "bundle") pod "f156dae1-1d4a-47b3-835e-016325f1981c" (UID: "f156dae1-1d4a-47b3-835e-016325f1981c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.384304 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw" (OuterVolumeSpecName: "kube-api-access-rqjfw") pod "f156dae1-1d4a-47b3-835e-016325f1981c" (UID: "f156dae1-1d4a-47b3-835e-016325f1981c"). InnerVolumeSpecName "kube-api-access-rqjfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.394062 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util" (OuterVolumeSpecName: "util") pod "f156dae1-1d4a-47b3-835e-016325f1981c" (UID: "f156dae1-1d4a-47b3-835e-016325f1981c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.395814 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util" (OuterVolumeSpecName: "util") pod "bfce146a-61fa-4821-ab43-8fd35dc5fe07" (UID: "bfce146a-61fa-4821-ab43-8fd35dc5fe07"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.401239 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd" (OuterVolumeSpecName: "kube-api-access-lhrzd") pod "bfce146a-61fa-4821-ab43-8fd35dc5fe07" (UID: "bfce146a-61fa-4821-ab43-8fd35dc5fe07"). InnerVolumeSpecName "kube-api-access-lhrzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478597 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478626 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478635 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478643 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") on node \"crc\" DevicePath \"\"" 
Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478651 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478659 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.859362 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.859388 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"43ac63093f7b6c93cf4fb80892a768e3f1ded2f4bd27836925fe1e0e4db43b6b"} Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.859427 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ac63093f7b6c93cf4fb80892a768e3f1ded2f4bd27836925fe1e0e4db43b6b" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.861129 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1"} Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.861168 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.861280 4791 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.973520 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.973604 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.973672 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.974523 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.974644 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4" gracePeriod=600 Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.873579 4791 generic.go:334] "Generic 
(PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4" exitCode=0 Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.873608 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"} Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.874150 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb"} Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.874176 4791 scope.go:117] "RemoveContainer" containerID="9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952298 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj"] Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952507 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952523 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952537 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952543 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: 
E0217 00:16:25.952553 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952558 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952568 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952573 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952582 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952588 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952598 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952604 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952703 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952712 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.953064 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.955917 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rm4vf" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.956184 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.957473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.969073 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj"] Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.992592 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5"] Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.993357 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.995058 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.997247 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6m5f5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.006583 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.010230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5h5l\" (UniqueName: \"kubernetes.io/projected/f73f7b40-6611-465e-ae69-d2f70ce77651-kube-api-access-r5h5l\") pod \"obo-prometheus-operator-68bc856cb9-rw6pj\" (UID: \"f73f7b40-6611-465e-ae69-d2f70ce77651\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.012133 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.012754 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.025581 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112101 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112143 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112190 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112230 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5h5l\" (UniqueName: \"kubernetes.io/projected/f73f7b40-6611-465e-ae69-d2f70ce77651-kube-api-access-r5h5l\") pod \"obo-prometheus-operator-68bc856cb9-rw6pj\" (UID: \"f73f7b40-6611-465e-ae69-d2f70ce77651\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.129406 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5h5l\" (UniqueName: \"kubernetes.io/projected/f73f7b40-6611-465e-ae69-d2f70ce77651-kube-api-access-r5h5l\") pod \"obo-prometheus-operator-68bc856cb9-rw6pj\" (UID: \"f73f7b40-6611-465e-ae69-d2f70ce77651\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.179338 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v2lwp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.180699 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.195274 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.195466 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gngdj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.201452 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v2lwp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213026 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213262 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213374 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.216127 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.216702 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.218377 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.219152 4791 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.277361 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.310840 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.314475 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/307585d5-5ed8-43df-b5d8-977729339610-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.314530 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmv9\" (UniqueName: \"kubernetes.io/projected/307585d5-5ed8-43df-b5d8-977729339610-kube-api-access-zlmv9\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.324525 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.395398 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mk4lp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.396556 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.400696 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-2q4sx" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.416283 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mk4lp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.417024 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/307585d5-5ed8-43df-b5d8-977729339610-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.417094 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmv9\" (UniqueName: \"kubernetes.io/projected/307585d5-5ed8-43df-b5d8-977729339610-kube-api-access-zlmv9\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.434292 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/307585d5-5ed8-43df-b5d8-977729339610-observability-operator-tls\") pod 
\"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.441969 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmv9\" (UniqueName: \"kubernetes.io/projected/307585d5-5ed8-43df-b5d8-977729339610-kube-api-access-zlmv9\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.470457 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.471419 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.477433 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.482221 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.523251 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4swr\" (UniqueName: \"kubernetes.io/projected/3b110234-d36d-4ced-a2be-7913bbb84d2a-kube-api-access-z4swr\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.523336 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b110234-d36d-4ced-a2be-7913bbb84d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.523262 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.624860 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4swr\" (UniqueName: \"kubernetes.io/projected/3b110234-d36d-4ced-a2be-7913bbb84d2a-kube-api-access-z4swr\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625383 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625410 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b110234-d36d-4ced-a2be-7913bbb84d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625615 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.626737 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b110234-d36d-4ced-a2be-7913bbb84d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.644970 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4swr\" (UniqueName: \"kubernetes.io/projected/3b110234-d36d-4ced-a2be-7913bbb84d2a-kube-api-access-z4swr\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726311 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: 
\"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726387 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726418 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726837 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726938 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.742814 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.750029 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.775422 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj"] Feb 17 00:16:26 crc kubenswrapper[4791]: W0217 00:16:26.784933 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf73f7b40_6611_465e_ae69_d2f70ce77651.slice/crio-9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971 WatchSource:0}: Error finding container 9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971: Status 404 returned error can't find the container with id 9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971 Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.826523 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.856146 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.866177 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5"] Feb 17 00:16:26 crc kubenswrapper[4791]: W0217 00:16:26.879091 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb046e97f_6343_4e3f_ae0a_0fb40687d992.slice/crio-5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570 WatchSource:0}: Error finding container 5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570: Status 404 returned error can't find the container with id 5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570 Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.894011 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" event={"ID":"f73f7b40-6611-465e-ae69-d2f70ce77651","Type":"ContainerStarted","Data":"9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971"} Feb 17 00:16:26 crc kubenswrapper[4791]: W0217 00:16:26.900271 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c43370f_07b8_4f84_b716_34af90be5850.slice/crio-9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3 WatchSource:0}: Error finding container 9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3: Status 404 returned error can't find the container with id 9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3 Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 
00:16:26.995804 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mk4lp"] Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.023679 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v2lwp"] Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.093044 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"] Feb 17 00:16:27 crc kubenswrapper[4791]: W0217 00:16:27.114707 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba03d7dd_7e00_4b21_a86b_a2cabeb36ed9.slice/crio-d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8 WatchSource:0}: Error finding container d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8: Status 404 returned error can't find the container with id d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8 Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.901966 4791 generic.go:334] "Generic (PLEG): container finished" podID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerID="b48607b7d89d41c7ceb2f3bd92cc56bbed0f0f6540297a05fb55091429fcd5da" exitCode=0 Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.902091 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"b48607b7d89d41c7ceb2f3bd92cc56bbed0f0f6540297a05fb55091429fcd5da"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.902371 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" 
event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerStarted","Data":"d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.904374 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" event={"ID":"b046e97f-6343-4e3f-ae0a-0fb40687d992","Type":"ContainerStarted","Data":"5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.905780 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" event={"ID":"307585d5-5ed8-43df-b5d8-977729339610","Type":"ContainerStarted","Data":"dd98b123aef2a50077000000d95f9b2e8173ce13179a87aa091eaff166d7f997"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.907989 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" event={"ID":"8c43370f-07b8-4f84-b716-34af90be5850","Type":"ContainerStarted","Data":"9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.909984 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" event={"ID":"3b110234-d36d-4ced-a2be-7913bbb84d2a","Type":"ContainerStarted","Data":"7e21575c377cff335ed8fe1114a7af3a0c188c3c4805ddaa4207691a89b4097d"} Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.657315 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-d5d58ff4c-lwcwp"] Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.659463 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.664013 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.664397 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.664607 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-6pnfb" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.665509 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.673783 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-d5d58ff4c-lwcwp"] Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.736732 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-apiservice-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.736825 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-webhook-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.736900 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cswvd\" (UniqueName: \"kubernetes.io/projected/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-kube-api-access-cswvd\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.838090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-apiservice-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.838191 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-webhook-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.839063 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cswvd\" (UniqueName: \"kubernetes.io/projected/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-kube-api-access-cswvd\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.844650 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-webhook-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.844792 4791 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-apiservice-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.854235 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cswvd\" (UniqueName: \"kubernetes.io/projected/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-kube-api-access-cswvd\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.984877 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.355507 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-8j89k"] Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.356360 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.358521 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-2fxsh" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.368661 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-8j89k"] Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.474558 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65np\" (UniqueName: \"kubernetes.io/projected/d51ceaf8-c8f2-4dc0-bbca-35d3562dea95-kube-api-access-k65np\") pod \"interconnect-operator-5bb49f789d-8j89k\" (UID: \"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95\") " pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.575595 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65np\" (UniqueName: \"kubernetes.io/projected/d51ceaf8-c8f2-4dc0-bbca-35d3562dea95-kube-api-access-k65np\") pod \"interconnect-operator-5bb49f789d-8j89k\" (UID: \"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95\") " pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.597964 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k65np\" (UniqueName: \"kubernetes.io/projected/d51ceaf8-c8f2-4dc0-bbca-35d3562dea95-kube-api-access-k65np\") pod \"interconnect-operator-5bb49f789d-8j89k\" (UID: \"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95\") " pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.674480 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:39 crc kubenswrapper[4791]: I0217 00:16:39.496723 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-8j89k"] Feb 17 00:16:39 crc kubenswrapper[4791]: I0217 00:16:39.611934 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-d5d58ff4c-lwcwp"] Feb 17 00:16:39 crc kubenswrapper[4791]: W0217 00:16:39.612396 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d24e1c4_bdcf_4ffa_8138_b1fb47410471.slice/crio-c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4 WatchSource:0}: Error finding container c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4: Status 404 returned error can't find the container with id c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4 Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.002367 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" event={"ID":"f73f7b40-6611-465e-ae69-d2f70ce77651","Type":"ContainerStarted","Data":"ad91fe460633ada4a8aca473c899623845f90dd0d80c637ca4a65fff008e362b"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.004583 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" event={"ID":"3b110234-d36d-4ced-a2be-7913bbb84d2a","Type":"ContainerStarted","Data":"ab9e75355363567b2b1284b61535372322efcbdc9ba8d2ff20a1a2b2079a47e3"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.004668 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.005851 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" event={"ID":"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95","Type":"ContainerStarted","Data":"077c74483fe73e7c70896149bb63347cf7a239abea188efe5a2c96f70eef4a9b"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.009158 4791 generic.go:334] "Generic (PLEG): container finished" podID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerID="1611fbf3a08f12bfe96c53f7f199ba9f08cb8d6e0c390dadcd956fd2bbaf7a18" exitCode=0 Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.009230 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"1611fbf3a08f12bfe96c53f7f199ba9f08cb8d6e0c390dadcd956fd2bbaf7a18"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.011445 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" event={"ID":"b046e97f-6343-4e3f-ae0a-0fb40687d992","Type":"ContainerStarted","Data":"2da4dd72dcd1c35bb0860091ac2cf49de3b25ff54851351b13dd6b08718b9245"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.012775 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" event={"ID":"9d24e1c4-bdcf-4ffa-8138-b1fb47410471","Type":"ContainerStarted","Data":"c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.014620 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" event={"ID":"307585d5-5ed8-43df-b5d8-977729339610","Type":"ContainerStarted","Data":"106165e9d57803770c869270b914e95e426ac34ea14d8df1561b13b04049e1fc"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.014808 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.016907 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" event={"ID":"8c43370f-07b8-4f84-b716-34af90be5850","Type":"ContainerStarted","Data":"fc7ee0cb4835fd7f4c48103faddc9dcf102c386ddfd16779326abf98fe254ee1"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.018108 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.037132 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" podStartSLOduration=2.664603978 podStartE2EDuration="15.037098715s" podCreationTimestamp="2026-02-17 00:16:25 +0000 UTC" firstStartedPulling="2026-02-17 00:16:26.787844471 +0000 UTC m=+644.267356998" lastFinishedPulling="2026-02-17 00:16:39.160339208 +0000 UTC m=+656.639851735" observedRunningTime="2026-02-17 00:16:40.032553643 +0000 UTC m=+657.512066170" watchObservedRunningTime="2026-02-17 00:16:40.037098715 +0000 UTC m=+657.516611242" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.079641 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" podStartSLOduration=1.927164462 podStartE2EDuration="14.079625017s" podCreationTimestamp="2026-02-17 00:16:26 +0000 UTC" firstStartedPulling="2026-02-17 00:16:27.009223695 +0000 UTC m=+644.488736212" lastFinishedPulling="2026-02-17 00:16:39.16168424 +0000 UTC m=+656.641196767" observedRunningTime="2026-02-17 00:16:40.075508869 +0000 UTC m=+657.555021396" watchObservedRunningTime="2026-02-17 00:16:40.079625017 +0000 UTC m=+657.559137544" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.091348 4791 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" podStartSLOduration=2.896557962 podStartE2EDuration="15.091329583s" podCreationTimestamp="2026-02-17 00:16:25 +0000 UTC" firstStartedPulling="2026-02-17 00:16:26.890756425 +0000 UTC m=+644.370268952" lastFinishedPulling="2026-02-17 00:16:39.085528046 +0000 UTC m=+656.565040573" observedRunningTime="2026-02-17 00:16:40.088920638 +0000 UTC m=+657.568433165" watchObservedRunningTime="2026-02-17 00:16:40.091329583 +0000 UTC m=+657.570842110" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.141999 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" podStartSLOduration=2.012314184 podStartE2EDuration="14.141980339s" podCreationTimestamp="2026-02-17 00:16:26 +0000 UTC" firstStartedPulling="2026-02-17 00:16:27.03854344 +0000 UTC m=+644.518055967" lastFinishedPulling="2026-02-17 00:16:39.168209605 +0000 UTC m=+656.647722122" observedRunningTime="2026-02-17 00:16:40.121545109 +0000 UTC m=+657.601057636" watchObservedRunningTime="2026-02-17 00:16:40.141980339 +0000 UTC m=+657.621492866" Feb 17 00:16:41 crc kubenswrapper[4791]: I0217 00:16:41.025045 4791 generic.go:334] "Generic (PLEG): container finished" podID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerID="66476494ffaa29bc01d1efdd565e3f492af17843ab79206fd7d8c429a6feeb76" exitCode=0 Feb 17 00:16:41 crc kubenswrapper[4791]: I0217 00:16:41.025168 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"66476494ffaa29bc01d1efdd565e3f492af17843ab79206fd7d8c429a6feeb76"} Feb 17 00:16:41 crc kubenswrapper[4791]: I0217 00:16:41.041336 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" podStartSLOduration=3.873131483 podStartE2EDuration="16.041310774s" podCreationTimestamp="2026-02-17 00:16:25 +0000 UTC" firstStartedPulling="2026-02-17 00:16:26.917335745 +0000 UTC m=+644.396848272" lastFinishedPulling="2026-02-17 00:16:39.085515036 +0000 UTC m=+656.565027563" observedRunningTime="2026-02-17 00:16:40.143007311 +0000 UTC m=+657.622519828" watchObservedRunningTime="2026-02-17 00:16:41.041310774 +0000 UTC m=+658.520823321"
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.637961 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.662379 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") "
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.662442 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") "
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.662474 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") "
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.664284 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle" (OuterVolumeSpecName: "bundle") pod "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" (UID: "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.674557 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util" (OuterVolumeSpecName: "util") pod "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" (UID: "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.702399 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m" (OuterVolumeSpecName: "kube-api-access-l2m9m") pod "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" (UID: "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9"). InnerVolumeSpecName "kube-api-access-l2m9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.763449 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.763483 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.763496 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") on node \"crc\" DevicePath \"\""
Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.048713 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8"}
Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.049013 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8"
Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.048764 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"
Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.051008 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" event={"ID":"9d24e1c4-bdcf-4ffa-8138-b1fb47410471","Type":"ContainerStarted","Data":"cb642cde62bb6e3f4561182cda6bac3b7a21c796661f905fb3d1be66aa26d173"}
Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.250231 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" podStartSLOduration=7.179944458 podStartE2EDuration="10.250209924s" podCreationTimestamp="2026-02-17 00:16:33 +0000 UTC" firstStartedPulling="2026-02-17 00:16:39.616396446 +0000 UTC m=+657.095908973" lastFinishedPulling="2026-02-17 00:16:42.686661912 +0000 UTC m=+660.166174439" observedRunningTime="2026-02-17 00:16:43.076866137 +0000 UTC m=+660.556378664" watchObservedRunningTime="2026-02-17 00:16:43.250209924 +0000 UTC m=+660.729722461"
Feb 17 00:16:46 crc kubenswrapper[4791]: I0217 00:16:46.752833 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp"
Feb 17 00:16:49 crc kubenswrapper[4791]: I0217 00:16:49.094334 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" event={"ID":"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95","Type":"ContainerStarted","Data":"7e92305fd7cfeea49afe034fa0bc1eeb102bcf954c1a494f42aa6d1a4406956d"}
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.379065 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" podStartSLOduration=13.884736432 podStartE2EDuration="22.379049082s" podCreationTimestamp="2026-02-17 00:16:36 +0000 UTC" firstStartedPulling="2026-02-17 00:16:39.50283168 +0000 UTC m=+656.982344207" lastFinishedPulling="2026-02-17 00:16:47.99714433 +0000 UTC m=+665.476656857" observedRunningTime="2026-02-17 00:16:49.113220379 +0000 UTC m=+666.592732906" watchObservedRunningTime="2026-02-17 00:16:58.379049082 +0000 UTC m=+675.858561609"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.382278 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Feb 17 00:16:58 crc kubenswrapper[4791]: E0217 00:16:58.383151 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="pull"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383170 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="pull"
Feb 17 00:16:58 crc kubenswrapper[4791]: E0217 00:16:58.383179 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="util"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383186 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="util"
Feb 17 00:16:58 crc kubenswrapper[4791]: E0217 00:16:58.383202 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="extract"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383209 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="extract"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383490 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="extract"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.384216 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.385374 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.387869 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388438 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388500 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-9vtp6"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388750 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388936 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.389062 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.389184 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.401957 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.408383 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471234 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471277 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471297 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471319 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471427 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471576 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471659 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471726 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471748 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471769 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471793 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471879 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471945 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/335ade17-e7c1-487c-9e12-ad3d0d3610b0-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.472020 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.472049 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573310 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573362 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573390 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573604 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573623 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573649 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573675 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573700 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573717 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573734 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573756 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573774 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573795 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/335ade17-e7c1-487c-9e12-ad3d0d3610b0-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573824 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573842 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574195 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574195 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574309 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574451 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574781 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.575169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.575186 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.575922 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.583337 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.584179 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/335ade17-e7c1-487c-9e12-ad3d0d3610b0-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.584819 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.585021 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.587851 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.591779 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.599741 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.701520 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:16:59 crc kubenswrapper[4791]: I0217 00:16:59.015077 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Feb 17 00:16:59 crc kubenswrapper[4791]: I0217 00:16:59.153832 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerStarted","Data":"88d6b483b0014d6e616311beef6b916ce80ae994c03f1b13e031beed251561a5"}
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.120057 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"]
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.121314 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.124474 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.124534 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-lcc8z"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.124683 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.132786 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"]
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.177914 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.178206 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qts\" (UniqueName: \"kubernetes.io/projected/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-kube-api-access-s9qts\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.279777 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.279835 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qts\" (UniqueName: \"kubernetes.io/projected/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-kube-api-access-s9qts\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.280951 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.302491 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qts\" (UniqueName: \"kubernetes.io/projected/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-kube-api-access-s9qts\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.442620 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"
Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.694845 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"]
Feb 17 00:17:04 crc kubenswrapper[4791]: W0217 00:17:04.707541 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc89f8e_fb5f_4ba2_826a_d93c8a11383c.slice/crio-3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e WatchSource:0}: Error finding container 3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e: Status 404 returned error can't find the container with id 3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e
Feb 17 00:17:05 crc kubenswrapper[4791]: I0217 00:17:05.191963 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" event={"ID":"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c","Type":"ContainerStarted","Data":"3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e"}
Feb 17 00:17:15 crc kubenswrapper[4791]: E0217 00:17:15.895979 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20"
Feb 17 00:17:15 crc kubenswrapper[4791]: E0217 00:17:15.896750 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(335ade17-e7c1-487c-9e12-ad3d0d3610b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 00:17:15 crc kubenswrapper[4791]: E0217 00:17:15.898024 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:16 crc 
kubenswrapper[4791]: I0217 00:17:16.268312 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" event={"ID":"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c","Type":"ContainerStarted","Data":"efecf6143e91b35e4b5e6220794aa692f62f75c8dd9fa04143d3e2abad00b483"} Feb 17 00:17:16 crc kubenswrapper[4791]: E0217 00:17:16.269604 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:16 crc kubenswrapper[4791]: I0217 00:17:16.328870 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" podStartSLOduration=1.225165164 podStartE2EDuration="12.328849377s" podCreationTimestamp="2026-02-17 00:17:04 +0000 UTC" firstStartedPulling="2026-02-17 00:17:04.711346958 +0000 UTC m=+682.190859515" lastFinishedPulling="2026-02-17 00:17:15.815031181 +0000 UTC m=+693.294543728" observedRunningTime="2026-02-17 00:17:16.32605878 +0000 UTC m=+693.805571307" watchObservedRunningTime="2026-02-17 00:17:16.328849377 +0000 UTC m=+693.808361904" Feb 17 00:17:16 crc kubenswrapper[4791]: I0217 00:17:16.407728 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:17:16 crc kubenswrapper[4791]: I0217 00:17:16.442057 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:17:17 crc kubenswrapper[4791]: E0217 00:17:17.274812 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:18 crc kubenswrapper[4791]: E0217 00:17:18.278633 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.686101 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-42zhs"] Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.687553 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.689943 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ppxz9" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.690403 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.690546 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.695000 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-42zhs"] Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.780532 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hdw\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-kube-api-access-p4hdw\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: 
\"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.780603 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.882051 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.882214 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hdw\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-kube-api-access-p4hdw\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.911377 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hdw\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-kube-api-access-p4hdw\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.923231 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:19 crc kubenswrapper[4791]: I0217 00:17:19.005153 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:19 crc kubenswrapper[4791]: I0217 00:17:19.252623 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-42zhs"] Feb 17 00:17:19 crc kubenswrapper[4791]: I0217 00:17:19.284540 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" event={"ID":"4b26c415-6a42-4bda-abbd-cf394bc94043","Type":"ContainerStarted","Data":"cb83f072908cc41e0fd023cebdc45b8fb2fa757a086a0de47168b6d4c4b95a54"} Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.235302 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l9798"] Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.236878 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.239287 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-88mxr" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.264345 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l9798"] Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.333411 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4z9\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-kube-api-access-ll4z9\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.333697 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.434892 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4z9\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-kube-api-access-ll4z9\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.434960 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.455654 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4z9\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-kube-api-access-ll4z9\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.455908 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.566578 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:24 crc kubenswrapper[4791]: I0217 00:17:24.310907 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l9798"] Feb 17 00:17:24 crc kubenswrapper[4791]: W0217 00:17:24.321779 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca3c38d_0dd8_4457_854a_b392ba180087.slice/crio-88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362 WatchSource:0}: Error finding container 88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362: Status 404 returned error can't find the container with id 88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362 Feb 17 00:17:24 crc kubenswrapper[4791]: I0217 00:17:24.334655 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" event={"ID":"aca3c38d-0dd8-4457-854a-b392ba180087","Type":"ContainerStarted","Data":"88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362"} Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.343995 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" event={"ID":"aca3c38d-0dd8-4457-854a-b392ba180087","Type":"ContainerStarted","Data":"32a101c5108440b5174b1a25337b522ac0e2f252243397f619bc2c8bf52eca97"} Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.345834 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" event={"ID":"4b26c415-6a42-4bda-abbd-cf394bc94043","Type":"ContainerStarted","Data":"2599d909254c9145c013a9f8b1cccad4b26f21fb9eda93dcc51e01d72be7a590"} Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.345985 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:25 crc 
kubenswrapper[4791]: I0217 00:17:25.467968 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" podStartSLOduration=3.467950891 podStartE2EDuration="3.467950891s" podCreationTimestamp="2026-02-17 00:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:17:25.39503057 +0000 UTC m=+702.874543097" watchObservedRunningTime="2026-02-17 00:17:25.467950891 +0000 UTC m=+702.947463428" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.469828 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" podStartSLOduration=2.572843878 podStartE2EDuration="7.469818661s" podCreationTimestamp="2026-02-17 00:17:18 +0000 UTC" firstStartedPulling="2026-02-17 00:17:19.259602326 +0000 UTC m=+696.739114853" lastFinishedPulling="2026-02-17 00:17:24.156577109 +0000 UTC m=+701.636089636" observedRunningTime="2026-02-17 00:17:25.465489004 +0000 UTC m=+702.945001541" watchObservedRunningTime="2026-02-17 00:17:25.469818661 +0000 UTC m=+702.949331208" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.509662 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.510772 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.514789 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.514933 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.515278 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.515805 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.548976 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591128 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591362 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591438 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591505 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591569 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591692 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591767 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod 
\"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591845 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591915 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591989 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.592085 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.592199 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.693524 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.693754 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694301 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694967 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695342 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695678 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695797 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696122 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696398 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod 
\"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696792 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696927 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.697325 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694922 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696364 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695637 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696762 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694253 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696888 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695769 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695304 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.698177 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.700759 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.714684 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.716055 
4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.823898 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:26 crc kubenswrapper[4791]: I0217 00:17:26.088455 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:26 crc kubenswrapper[4791]: I0217 00:17:26.353794 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerStarted","Data":"f8c7e3abade9be907d19dcea3944ab13d63c479cb28ae61ecc17366d326050c2"} Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.009721 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.737008 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dsmn"] Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.737913 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.740716 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-g6pq2" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.749085 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dsmn"] Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.850161 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhp7c\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-kube-api-access-bhp7c\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.850222 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-bound-sa-token\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.951856 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhp7c\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-kube-api-access-bhp7c\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.951911 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-bound-sa-token\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: 
\"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.979622 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-bound-sa-token\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.979878 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhp7c\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-kube-api-access-bhp7c\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:30 crc kubenswrapper[4791]: I0217 00:17:30.066614 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.152179 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dsmn"] Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.396776 4791 generic.go:334] "Generic (PLEG): container finished" podID="d8695c88-2448-4593-8029-3ce49d07ca00" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8" exitCode=0 Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.396856 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerDied","Data":"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"} Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.398393 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9dsmn" 
event={"ID":"bf759390-4034-42c9-811b-531aeabd3ed6","Type":"ContainerStarted","Data":"eeceee21040b1bf28580fb56818482346ebf369b17eeb2321be2018b1d2a00c8"} Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.398439 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9dsmn" event={"ID":"bf759390-4034-42c9-811b-531aeabd3ed6","Type":"ContainerStarted","Data":"1f8d652602d661e7a55fb3fe99a84922d0cad37c0465c53d81b792f11279f06d"} Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.446400 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9dsmn" podStartSLOduration=4.446379352 podStartE2EDuration="4.446379352s" podCreationTimestamp="2026-02-17 00:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:17:33.443902783 +0000 UTC m=+710.923415320" watchObservedRunningTime="2026-02-17 00:17:33.446379352 +0000 UTC m=+710.925891889" Feb 17 00:17:34 crc kubenswrapper[4791]: I0217 00:17:34.407460 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerStarted","Data":"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"} Feb 17 00:17:34 crc kubenswrapper[4791]: I0217 00:17:34.410210 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerStarted","Data":"c4d4033b577bd6d027c697301176c69c0a46b27d66a578e3f38c80be97b97a25"} Feb 17 00:17:34 crc kubenswrapper[4791]: I0217 00:17:34.443013 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=2.754913236 podStartE2EDuration="9.442988671s" podCreationTimestamp="2026-02-17 00:17:25 
+0000 UTC" firstStartedPulling="2026-02-17 00:17:26.094270829 +0000 UTC m=+703.573783376" lastFinishedPulling="2026-02-17 00:17:32.782346244 +0000 UTC m=+710.261858811" observedRunningTime="2026-02-17 00:17:34.439925615 +0000 UTC m=+711.919438152" watchObservedRunningTime="2026-02-17 00:17:34.442988671 +0000 UTC m=+711.922501238" Feb 17 00:17:35 crc kubenswrapper[4791]: I0217 00:17:35.557846 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:36 crc kubenswrapper[4791]: I0217 00:17:36.436084 4791 generic.go:334] "Generic (PLEG): container finished" podID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" containerID="c4d4033b577bd6d027c697301176c69c0a46b27d66a578e3f38c80be97b97a25" exitCode=0 Feb 17 00:17:36 crc kubenswrapper[4791]: I0217 00:17:36.436196 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerDied","Data":"c4d4033b577bd6d027c697301176c69c0a46b27d66a578e3f38c80be97b97a25"} Feb 17 00:17:36 crc kubenswrapper[4791]: I0217 00:17:36.436904 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" containerID="cri-o://9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" gracePeriod=30 Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.259848 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.261727 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.264878 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.265574 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.265869 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.288995 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.453800 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.453881 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.453931 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454240 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454333 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454371 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454412 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454530 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454590 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454783 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454857 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod 
\"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556173 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556260 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556327 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556362 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556433 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556367 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556472 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556532 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556579 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556616 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556655 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556693 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556865 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556888 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557246 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557426 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557605 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557796 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: 
\"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.558239 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.558566 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.562308 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.562449 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.573569 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl8nj\" (UniqueName: 
\"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.580409 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:40 crc kubenswrapper[4791]: W0217 00:17:40.534054 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a3e8ccc_76aa_44db_b5bb_f4043e185f4f.slice/crio-000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce WatchSource:0}: Error finding container 000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce: Status 404 returned error can't find the container with id 000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce Feb 17 00:17:40 crc kubenswrapper[4791]: I0217 00:17:40.539083 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:41 crc kubenswrapper[4791]: I0217 00:17:41.478566 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerStarted","Data":"000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce"} Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.056442 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_d8695c88-2448-4593-8029-3ce49d07ca00/docker-build/0.log" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.058012 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.243823 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244562 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244662 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244712 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244833 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244899 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244967 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245037 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245154 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245215 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245254 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245306 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245398 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245484 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245580 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245521 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245769 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246229 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246270 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246298 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246322 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246350 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246740 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.247100 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.247595 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.248096 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.251760 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.252495 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.252839 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln" (OuterVolumeSpecName: "kube-api-access-kd6ln") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "kube-api-access-kd6ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348024 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348094 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348187 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348211 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348229 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348246 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348265 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.504803 4791 generic.go:334] "Generic (PLEG): container finished" podID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" containerID="9e94e4637793dfe889be430349ee205bb9c145a8da231847d4d11b6bbde20812" exitCode=0 Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.504857 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerDied","Data":"9e94e4637793dfe889be430349ee205bb9c145a8da231847d4d11b6bbde20812"} Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.508484 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_d8695c88-2448-4593-8029-3ce49d07ca00/docker-build/0.log" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.508986 4791 generic.go:334] "Generic (PLEG): container finished" podID="d8695c88-2448-4593-8029-3ce49d07ca00" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" exitCode=1 Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509069 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerDied","Data":"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"} Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509104 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerDied","Data":"f8c7e3abade9be907d19dcea3944ab13d63c479cb28ae61ecc17366d326050c2"} Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509177 4791 scope.go:117] "RemoveContainer" 
containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509350 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.513098 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerStarted","Data":"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7"} Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.588487 4791 scope.go:117] "RemoveContainer" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.588835 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.596596 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.624208 4791 scope.go:117] "RemoveContainer" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" Feb 17 00:17:44 crc kubenswrapper[4791]: E0217 00:17:44.624693 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74\": container with ID starting with 9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74 not found: ID does not exist" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.624727 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"} err="failed to get container status \"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74\": rpc error: code = NotFound desc = could not find container \"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74\": container with ID starting with 9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74 not found: ID does not exist" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.624746 4791 scope.go:117] "RemoveContainer" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8" Feb 17 00:17:44 crc kubenswrapper[4791]: E0217 00:17:44.625168 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8\": container with ID starting with b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8 not found: ID does not exist" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8" Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.625190 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"} err="failed to get container status \"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8\": rpc error: code = NotFound desc = could not find container \"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8\": container with ID starting with b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8 not found: ID does not exist" Feb 17 00:17:44 crc kubenswrapper[4791]: E0217 00:17:44.669727 4791 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5079511847538378697, SKID=, AKID=AD:CC:45:B9:DF:E8:B7:57:BC:FE:80:6F:F9:92:EA:D4:BA:33:46:63 failed: x509: certificate signed by 
unknown authority" Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.233945 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" path="/var/lib/kubelet/pods/d8695c88-2448-4593-8029-3ce49d07ca00/volumes" Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.523624 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerStarted","Data":"f99d0ba12b6f516ce592793e9ba34f5ae8f85a650bd90b8412798ee4cdcb4c0e"} Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.524196 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.574420 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=13.329613495 podStartE2EDuration="47.574403286s" podCreationTimestamp="2026-02-17 00:16:58 +0000 UTC" firstStartedPulling="2026-02-17 00:16:59.020677557 +0000 UTC m=+676.500190084" lastFinishedPulling="2026-02-17 00:17:33.265467348 +0000 UTC m=+710.744979875" observedRunningTime="2026-02-17 00:17:45.573735645 +0000 UTC m=+723.053248172" watchObservedRunningTime="2026-02-17 00:17:45.574403286 +0000 UTC m=+723.053915813" Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.705263 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:46 crc kubenswrapper[4791]: I0217 00:17:46.529959 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-2-build" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" containerID="cri-o://61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" gracePeriod=30 Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 
00:17:47.007663 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_6a3e8ccc-76aa-44db-b5bb-f4043e185f4f/git-clone/0.log" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.008003 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099267 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099560 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099644 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099714 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099820 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099918 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100018 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100182 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100261 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099746 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" 
(UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100423 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100500 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100565 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100645 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100883 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100943 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.107603 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.107630 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.109758 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.109847 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.111627 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.111726 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.111811 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.112251 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.113779 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj" (OuterVolumeSpecName: "kube-api-access-gl8nj") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "kube-api-access-gl8nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.117121 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201512 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201755 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201837 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201915 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202019 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202095 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202190 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202263 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202343 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202407 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536085 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_6a3e8ccc-76aa-44db-b5bb-f4043e185f4f/git-clone/0.log" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536152 4791 generic.go:334] "Generic (PLEG): container finished" podID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" exitCode=1 Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536185 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerDied","Data":"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7"} Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536218 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" 
event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerDied","Data":"000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce"} Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536228 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536237 4791 scope.go:117] "RemoveContainer" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.558680 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.559059 4791 scope.go:117] "RemoveContainer" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" Feb 17 00:17:47 crc kubenswrapper[4791]: E0217 00:17:47.559747 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7\": container with ID starting with 61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7 not found: ID does not exist" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.559776 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7"} err="failed to get container status \"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7\": rpc error: code = NotFound desc = could not find container \"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7\": container with ID starting with 61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7 not found: ID does not exist" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 
00:17:47.564753 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:49 crc kubenswrapper[4791]: I0217 00:17:49.228729 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" path="/var/lib/kubelet/pods/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f/volumes" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.115220 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:17:57 crc kubenswrapper[4791]: E0217 00:17:57.118050 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.118308 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" Feb 17 00:17:57 crc kubenswrapper[4791]: E0217 00:17:57.118476 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.118632 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" Feb 17 00:17:57 crc kubenswrapper[4791]: E0217 00:17:57.118799 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="manage-dockerfile" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.118953 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="manage-dockerfile" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.119385 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.119584 4791 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.121676 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.126434 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-sys-config" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.127005 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-global-ca" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.127473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-ca" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.130259 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140803 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140847 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 
00:17:57.140889 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140958 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140998 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141012 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141040 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9mn\" (UniqueName: 
\"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141072 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141133 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141158 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 
00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141172 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.148737 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.241917 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.242637 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.242890 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.243829 4791 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.243978 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244437 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244625 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244747 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: 
I0217 00:17:57.244846 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244944 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245045 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245275 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " 
pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245405 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245626 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245470 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245923 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244996 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod 
\"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.246267 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245420 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.247276 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.260985 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.260985 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.265169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.440998 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.885520 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:17:57 crc kubenswrapper[4791]: W0217 00:17:57.893619 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295671d5_4684_438c_8761_4a5d0eb6a9c5.slice/crio-315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8 WatchSource:0}: Error finding container 315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8: Status 404 returned error can't find the container with id 315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8 Feb 17 00:17:58 crc kubenswrapper[4791]: I0217 00:17:58.618511 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerStarted","Data":"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9"} Feb 17 00:17:58 crc kubenswrapper[4791]: I0217 00:17:58.618841 4791 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerStarted","Data":"315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8"} Feb 17 00:17:58 crc kubenswrapper[4791]: E0217 00:17:58.679670 4791 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5079511847538378697, SKID=, AKID=AD:CC:45:B9:DF:E8:B7:57:BC:FE:80:6F:F9:92:EA:D4:BA:33:46:63 failed: x509: certificate signed by unknown authority" Feb 17 00:17:59 crc kubenswrapper[4791]: I0217 00:17:59.374858 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:17:59 crc kubenswrapper[4791]: I0217 00:17:59.722300 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:18:00 crc kubenswrapper[4791]: I0217 00:18:00.633606 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-3-build" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" containerID="cri-o://e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" gracePeriod=30 Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.016055 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_295671d5-4684-438c-8761-4a5d0eb6a9c5/git-clone/0.log" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.016347 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112682 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112736 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112770 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112815 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112862 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113382 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112863 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113375 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113421 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113460 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113450 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113510 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113556 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113594 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113631 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113747 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113770 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113836 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114130 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114150 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114158 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114167 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114175 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114184 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114214 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114256 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114434 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.120297 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.120314 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn" (OuterVolumeSpecName: "kube-api-access-rp9mn") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "kube-api-access-rp9mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.120332 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215422 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215459 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215473 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215484 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215496 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215507 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643258 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_295671d5-4684-438c-8761-4a5d0eb6a9c5/git-clone/0.log" Feb 17 00:18:01 crc 
kubenswrapper[4791]: I0217 00:18:01.643364 4791 generic.go:334] "Generic (PLEG): container finished" podID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" exitCode=1 Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643405 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerDied","Data":"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9"} Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643447 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerDied","Data":"315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8"} Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643472 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643477 4791 scope.go:117] "RemoveContainer" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.673280 4791 scope.go:117] "RemoveContainer" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" Feb 17 00:18:01 crc kubenswrapper[4791]: E0217 00:18:01.674144 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9\": container with ID starting with e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9 not found: ID does not exist" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.674182 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9"} err="failed to get container status \"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9\": rpc error: code = NotFound desc = could not find container \"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9\": container with ID starting with e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9 not found: ID does not exist" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.676228 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.685863 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:18:03 crc kubenswrapper[4791]: I0217 00:18:03.237664 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" path="/var/lib/kubelet/pods/295671d5-4684-438c-8761-4a5d0eb6a9c5/volumes" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.180650 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 17 00:18:11 crc kubenswrapper[4791]: E0217 00:18:11.181456 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.181471 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.181587 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.182427 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184327 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184623 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-global-ca" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184748 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-sys-config" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184861 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-ca" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.203818 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257283 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257323 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257344 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257363 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257406 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257433 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257490 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257519 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257540 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257568 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257603 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257624 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358437 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358555 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358622 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358691 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358742 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358792 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358842 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358936 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358947 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358990 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359184 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 
00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359184 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359207 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359281 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359297 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359930 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: 
\"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359961 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360001 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360162 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360350 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360828 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.365789 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.366770 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.386585 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.510268 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.844272 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 17 00:18:12 crc kubenswrapper[4791]: I0217 00:18:12.720688 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerStarted","Data":"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"} Feb 17 00:18:12 crc kubenswrapper[4791]: I0217 00:18:12.721187 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerStarted","Data":"642c451b43608526730ae9299d52d48dc5d8c9d9ba22f3ab1c76d4a4069d373a"} Feb 17 00:18:12 crc kubenswrapper[4791]: E0217 00:18:12.803949 4791 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5079511847538378697, SKID=, AKID=AD:CC:45:B9:DF:E8:B7:57:BC:FE:80:6F:F9:92:EA:D4:BA:33:46:63 failed: x509: certificate signed by unknown authority" Feb 17 00:18:13 crc kubenswrapper[4791]: I0217 00:18:13.843582 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 17 00:18:14 crc kubenswrapper[4791]: I0217 00:18:14.737653 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-4-build" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone" containerID="cri-o://df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b" gracePeriod=30 Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.181825 4791 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_e01fe51a-e7fc-45d8-80f9-8a6c767d97f8/git-clone/0.log" Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.182379 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317697 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317744 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317770 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317825 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317869 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets" 
(OuterVolumeSpecName: "node-pullsecrets") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317899 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.318322 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.318943 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.318994 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319054 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319080 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319137 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319166 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319231 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319435 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319675 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319699 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319711 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319724 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319681 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319717 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319758 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319893 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.320122 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.325880 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.326563 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd" (OuterVolumeSpecName: "kube-api-access-xbwcd") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "kube-api-access-xbwcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.330605 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421154 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421209 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421229 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421249 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421270 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421289 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421306 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421323 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747665 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_e01fe51a-e7fc-45d8-80f9-8a6c767d97f8/git-clone/0.log"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747751 4791 generic.go:334] "Generic (PLEG): container finished" podID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b" exitCode=1
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747804 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerDied","Data":"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"}
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747886 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerDied","Data":"642c451b43608526730ae9299d52d48dc5d8c9d9ba22f3ab1c76d4a4069d373a"}
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747904 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747923 4791 scope.go:117] "RemoveContainer" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.771954 4791 scope.go:117] "RemoveContainer" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"
Feb 17 00:18:15 crc kubenswrapper[4791]: E0217 00:18:15.772545 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b\": container with ID starting with df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b not found: ID does not exist" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.772585 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"} err="failed to get container status \"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b\": rpc error: code = NotFound desc = could not find container \"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b\": container with ID starting with df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b not found: ID does not exist"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.803620 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.828433 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 17 00:18:16 crc kubenswrapper[4791]: I0217 00:18:16.623067 4791 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 00:18:17 crc kubenswrapper[4791]: I0217 00:18:17.234700 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" path="/var/lib/kubelet/pods/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8/volumes"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.289609 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 17 00:18:25 crc kubenswrapper[4791]: E0217 00:18:25.290655 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.290671 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.290801 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.291620 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.295085 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.295122 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-global-ca"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.295691 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-sys-config"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.306241 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-ca"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.334831 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.468884 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.468968 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469036 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469100 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469217 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469379 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469445 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469495 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469580 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469791 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469903 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469935 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570852 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570910 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570929 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570956 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570976 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570996 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571020 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571035 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571056 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571076 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571095 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571431 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571443 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571660 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572024 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572160 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572627 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572863 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.573363 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.574067 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.578299 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.578502 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.603148 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.631387 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:26 crc kubenswrapper[4791]: I0217 00:18:26.133223 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 17 00:18:26 crc kubenswrapper[4791]: I0217 00:18:26.833176 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerStarted","Data":"0bcb2883fbf6e23deec884d193875f95422093db924f3ffdb529bd81606706b9"}
Feb 17 00:18:26 crc kubenswrapper[4791]: I0217 00:18:26.833579 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerStarted","Data":"6360b057a63c6515bb6e37784e1d42fc2bc72fdee92da431ce8d64435bcc1761"}
Feb 17 00:18:36 crc kubenswrapper[4791]: I0217 00:18:36.911412 4791 generic.go:334] "Generic (PLEG): container finished" podID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerID="0bcb2883fbf6e23deec884d193875f95422093db924f3ffdb529bd81606706b9" exitCode=0
Feb 17 00:18:36 crc kubenswrapper[4791]: I0217 00:18:36.911544 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"0bcb2883fbf6e23deec884d193875f95422093db924f3ffdb529bd81606706b9"}
Feb 17 00:18:37 crc kubenswrapper[4791]: I0217 00:18:37.922992 4791 generic.go:334] "Generic (PLEG): container finished" podID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerID="9ec220b676197175b00a9c30efe79e7cc1a1947917c35b253116c4d516d0c7f3" exitCode=0
Feb 17 00:18:37 crc kubenswrapper[4791]: I0217 00:18:37.923050 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"9ec220b676197175b00a9c30efe79e7cc1a1947917c35b253116c4d516d0c7f3"}
Feb 17 00:18:38 crc kubenswrapper[4791]: I0217 00:18:38.040862 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_f00d7586-332c-485f-b171-5b3f4f7a0728/manage-dockerfile/0.log"
Feb 17 00:18:38 crc kubenswrapper[4791]: I0217 00:18:38.934887 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerStarted","Data":"140e21b65fbe14e821c2dad12e49571665b14cf6c8f81e58c83c173b7ce95626"}
Feb 17 00:18:38 crc kubenswrapper[4791]: I0217 00:18:38.978535 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-5-build" podStartSLOduration=13.978505948 podStartE2EDuration="13.978505948s" podCreationTimestamp="2026-02-17 00:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:18:38.974158531 +0000 UTC m=+776.453671128" watchObservedRunningTime="2026-02-17 00:18:38.978505948 +0000 UTC m=+776.458018485"
Feb 17 00:18:54 crc kubenswrapper[4791]: I0217 00:18:54.973332 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:18:54 crc kubenswrapper[4791]: I0217 00:18:54.974155 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:19:24 crc kubenswrapper[4791]: I0217 00:19:24.973427 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:19:24 crc kubenswrapper[4791]: I0217 00:19:24.974091 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.973340 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.974307 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.974378 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw"
Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.975289 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.975388 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb" gracePeriod=600
Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499374 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb" exitCode=0
Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499514 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb"}
Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499787 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64"}
Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499818 4791 scope.go:117] "RemoveContainer" containerID="5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"
Feb 17 00:20:04 crc kubenswrapper[4791]: I0217 00:20:04.567619 4791 generic.go:334] "Generic (PLEG): container finished" podID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerID="140e21b65fbe14e821c2dad12e49571665b14cf6c8f81e58c83c173b7ce95626" exitCode=0
Feb 17 00:20:04 crc kubenswrapper[4791]: I0217 00:20:04.567673 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"140e21b65fbe14e821c2dad12e49571665b14cf6c8f81e58c83c173b7ce95626"}
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.508418 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"]
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.511165 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.525324 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"]
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.591382 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.592129 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.592362 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.693371 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.693422 4791 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.693457 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.694169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.694504 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.720659 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.809246 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.871086 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996252 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996319 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996352 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996382 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996415 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod 
\"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996441 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996446 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996479 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996550 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996591 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 
crc kubenswrapper[4791]: I0217 00:20:05.996623 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996673 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996696 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996975 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:05.997348 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:05.997352 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:05.999150 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.002823 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.003315 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.005386 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.006795 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h" (OuterVolumeSpecName: "kube-api-access-d889h") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "kube-api-access-d889h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.020281 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.040198 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.096099 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097936 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097960 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097973 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097985 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097998 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098010 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098022 4791 reconciler_common.go:293] 
"Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098034 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098045 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.197453 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.205913 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.582280 4791 generic.go:334] "Generic (PLEG): container finished" podID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerID="91023745db29e801135dc6120120cb42a05ec1d76bda30737b7f087a1e9aa42c" exitCode=0 Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.582780 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"91023745db29e801135dc6120120cb42a05ec1d76bda30737b7f087a1e9aa42c"} Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.582838 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerStarted","Data":"dfba4d631ae2af8e9109e32d876e3a5df34446a9b4d6fbf497cbfc4a9b323fe2"} Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.587976 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"6360b057a63c6515bb6e37784e1d42fc2bc72fdee92da431ce8d64435bcc1761"} Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.588021 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6360b057a63c6515bb6e37784e1d42fc2bc72fdee92da431ce8d64435bcc1761" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.588169 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 17 00:20:07 crc kubenswrapper[4791]: I0217 00:20:07.597757 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerStarted","Data":"e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f"} Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.052100 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.134480 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.609925 4791 generic.go:334] "Generic (PLEG): container finished" podID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerID="e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f" exitCode=0 Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.609985 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f"} Feb 17 00:20:09 crc kubenswrapper[4791]: I0217 00:20:09.619398 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" 
event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerStarted","Data":"14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe"} Feb 17 00:20:09 crc kubenswrapper[4791]: I0217 00:20:09.643405 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nrrr6" podStartSLOduration=2.11913963 podStartE2EDuration="4.643381566s" podCreationTimestamp="2026-02-17 00:20:05 +0000 UTC" firstStartedPulling="2026-02-17 00:20:06.585195676 +0000 UTC m=+864.064708213" lastFinishedPulling="2026-02-17 00:20:09.109437622 +0000 UTC m=+866.588950149" observedRunningTime="2026-02-17 00:20:09.636875011 +0000 UTC m=+867.116387538" watchObservedRunningTime="2026-02-17 00:20:09.643381566 +0000 UTC m=+867.122894103" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292437 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:10 crc kubenswrapper[4791]: E0217 00:20:10.292736 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="manage-dockerfile" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292755 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="manage-dockerfile" Feb 17 00:20:10 crc kubenswrapper[4791]: E0217 00:20:10.292778 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="docker-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292785 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="docker-build" Feb 17 00:20:10 crc kubenswrapper[4791]: E0217 00:20:10.292794 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="git-clone" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 
00:20:10.292802 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="git-clone" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292927 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="docker-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.293664 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.296323 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.296451 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.297010 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.298249 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.323479 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467023 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467066 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467089 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467133 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467169 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467201 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467223 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467240 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467273 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467300 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467332 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467351 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569061 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569225 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569287 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569394 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569439 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569508 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569515 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569550 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569737 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569821 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569891 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569995 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570425 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570582 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570607 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.571609 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.572405 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.573048 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.573780 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.581639 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.584924 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.597772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.607611 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.085007 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.657039 4791 generic.go:334] "Generic (PLEG): container finished" podID="bde8eea2-068e-4791-ad76-164945e7d646" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" exitCode=0
Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.657188 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerDied","Data":"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2"}
Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.657594 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerStarted","Data":"1ae461848c21aee33759740caf6db92789b07c617851ddd28d9629d261bba1f0"}
Feb 17 00:20:12 crc kubenswrapper[4791]: I0217 00:20:12.666879 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerStarted","Data":"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6"}
Feb 17 00:20:12 crc kubenswrapper[4791]: I0217 00:20:12.701716 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.701686761 podStartE2EDuration="2.701686761s" podCreationTimestamp="2026-02-17 00:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:20:12.69816186 +0000 UTC m=+870.177674387" watchObservedRunningTime="2026-02-17 00:20:12.701686761 +0000 UTC m=+870.181199328"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.631726 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djrqd"]
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.634613 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.655333 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrqd"]
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.812813 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsq4j\" (UniqueName: \"kubernetes.io/projected/79b4304a-5553-411d-a6df-e2af898a22b0-kube-api-access-rsq4j\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.813289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-utilities\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.813317 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-catalog-content\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915068 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsq4j\" (UniqueName: \"kubernetes.io/projected/79b4304a-5553-411d-a6df-e2af898a22b0-kube-api-access-rsq4j\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915223 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-utilities\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915248 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-catalog-content\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915749 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-utilities\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915802 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-catalog-content\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.933855 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsq4j\" (UniqueName: \"kubernetes.io/projected/79b4304a-5553-411d-a6df-e2af898a22b0-kube-api-access-rsq4j\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.955253 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djrqd"
Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.236493 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrqd"]
Feb 17 00:20:14 crc kubenswrapper[4791]: W0217 00:20:14.240648 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b4304a_5553_411d_a6df_e2af898a22b0.slice/crio-4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd WatchSource:0}: Error finding container 4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd: Status 404 returned error can't find the container with id 4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd
Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.681496 4791 generic.go:334] "Generic (PLEG): container finished" podID="79b4304a-5553-411d-a6df-e2af898a22b0" containerID="6265710aa1598cf2760a216bcb97d3c1d36d120dc17c36049999d2b40b284834" exitCode=0
Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.681598 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerDied","Data":"6265710aa1598cf2760a216bcb97d3c1d36d120dc17c36049999d2b40b284834"}
Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.681653 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerStarted","Data":"4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd"}
Feb 17 00:20:15 crc kubenswrapper[4791]: I0217 00:20:15.871692 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:15 crc kubenswrapper[4791]: I0217 00:20:15.872190 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:15 crc kubenswrapper[4791]: I0217 00:20:15.919243 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:16 crc kubenswrapper[4791]: I0217 00:20:16.731756 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:16 crc kubenswrapper[4791]: I0217 00:20:16.992536 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"]
Feb 17 00:20:18 crc kubenswrapper[4791]: I0217 00:20:18.710268 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nrrr6" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server" containerID="cri-o://14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe" gracePeriod=2
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.718744 4791 generic.go:334] "Generic (PLEG): container finished" podID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerID="14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe" exitCode=0
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.718885 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe"}
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.721367 4791 generic.go:334] "Generic (PLEG): container finished" podID="79b4304a-5553-411d-a6df-e2af898a22b0" containerID="b1e70ef860385035e08b05e1dac681a8ccfb5e9679b68717933094c9c7d4c761" exitCode=0
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.721429 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerDied","Data":"b1e70ef860385035e08b05e1dac681a8ccfb5e9679b68717933094c9c7d4c761"}
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.874528 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.998263 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"e466cdef-0ad2-4536-a50f-e323c91438dd\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") "
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.998339 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"e466cdef-0ad2-4536-a50f-e323c91438dd\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") "
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.998469 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"e466cdef-0ad2-4536-a50f-e323c91438dd\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") "
Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.999334 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities" (OuterVolumeSpecName: "utilities") pod "e466cdef-0ad2-4536-a50f-e323c91438dd" (UID: "e466cdef-0ad2-4536-a50f-e323c91438dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.007276 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq" (OuterVolumeSpecName: "kube-api-access-db8wq") pod "e466cdef-0ad2-4536-a50f-e323c91438dd" (UID: "e466cdef-0ad2-4536-a50f-e323c91438dd"). InnerVolumeSpecName "kube-api-access-db8wq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.060187 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e466cdef-0ad2-4536-a50f-e323c91438dd" (UID: "e466cdef-0ad2-4536-a50f-e323c91438dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.100517 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.100578 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.100599 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") on node \"crc\" DevicePath \"\""
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.730310 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"dfba4d631ae2af8e9109e32d876e3a5df34446a9b4d6fbf497cbfc4a9b323fe2"}
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.730335 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6"
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.730666 4791 scope.go:117] "RemoveContainer" containerID="14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe"
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.733746 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerStarted","Data":"155fd9bdac7e97704fca02965c46c1bb92205e62adecfc173559e54d00730649"}
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.746429 4791 scope.go:117] "RemoveContainer" containerID="e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f"
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.762848 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djrqd" podStartSLOduration=2.158952874 podStartE2EDuration="7.762828474s" podCreationTimestamp="2026-02-17 00:20:13 +0000 UTC" firstStartedPulling="2026-02-17 00:20:14.697889501 +0000 UTC m=+872.177402028" lastFinishedPulling="2026-02-17 00:20:20.301765101 +0000 UTC m=+877.781277628" observedRunningTime="2026-02-17 00:20:20.755444382 +0000 UTC m=+878.234956929" watchObservedRunningTime="2026-02-17 00:20:20.762828474 +0000 UTC m=+878.242341011"
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.777204 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"]
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.783382 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"]
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.791544 4791 scope.go:117] "RemoveContainer" containerID="91023745db29e801135dc6120120cb42a05ec1d76bda30737b7f087a1e9aa42c"
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.835394 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.835673 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" containerID="cri-o://4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" gracePeriod=30
Feb 17 00:20:21 crc kubenswrapper[4791]: I0217 00:20:21.234796 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" path="/var/lib/kubelet/pods/e466cdef-0ad2-4536-a50f-e323c91438dd/volumes"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416314 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Feb 17 00:20:22 crc kubenswrapper[4791]: E0217 00:20:22.416828 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-utilities"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416842 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-utilities"
Feb 17 00:20:22 crc kubenswrapper[4791]: E0217 00:20:22.416853 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-content"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416861 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-content"
Feb 17 00:20:22 crc kubenswrapper[4791]: E0217 00:20:22.416878 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416887 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.417008 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.418067 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.419889 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.419909 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.421872 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433364 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433500 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433570 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433711 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433759 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433932 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434005 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434079 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434246 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434352 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434398 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.437023 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.535966 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536005 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536026 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536045 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536076 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536099 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536133 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536193 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536210 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536240 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536262 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536200 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID:
\"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536449 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536654 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536729 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536952 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537186 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: 
\"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537270 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537479 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537661 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.541794 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.549000 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.555566 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.801005 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.104944 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.156338 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_bde8eea2-068e-4791-ad76-164945e7d646/docker-build/0.log" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.157188 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249230 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249280 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249324 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249360 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249396 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249386 4791 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249419 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249452 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249482 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249507 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249555 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249622 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249648 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249908 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250248 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250269 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250557 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250741 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.251246 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.251301 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.251518 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.255835 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj" (OuterVolumeSpecName: "kube-api-access-mvllj") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "kube-api-access-mvllj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.256395 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.260361 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351658 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351708 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351728 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351745 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351764 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 
00:20:23.351783 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351799 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351816 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.402100 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.453493 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.699248 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.757388 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.769680 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerStarted","Data":"6bc390cf261ee2ce905ed54511ecd2d4889323ecd586e5525896cd466846b745"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.769734 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerStarted","Data":"89476eba0b9817ae2dd5c5b95b35b38fe9d88bc0388a005c9bcf9bbec9490271"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772043 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_bde8eea2-068e-4791-ad76-164945e7d646/docker-build/0.log" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772493 4791 generic.go:334] "Generic (PLEG): container finished" podID="bde8eea2-068e-4791-ad76-164945e7d646" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" exitCode=1 Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772515 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerDied","Data":"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772530 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerDied","Data":"1ae461848c21aee33759740caf6db92789b07c617851ddd28d9629d261bba1f0"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772546 4791 scope.go:117] "RemoveContainer" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772639 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.832330 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.850640 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.866481 4791 scope.go:117] "RemoveContainer" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.897603 4791 scope.go:117] "RemoveContainer" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" Feb 17 00:20:23 crc kubenswrapper[4791]: E0217 00:20:23.897975 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6\": container with ID starting with 4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6 not found: ID does not exist" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.898014 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6"} err="failed to get container status 
\"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6\": rpc error: code = NotFound desc = could not find container \"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6\": container with ID starting with 4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6 not found: ID does not exist" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.898044 4791 scope.go:117] "RemoveContainer" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" Feb 17 00:20:23 crc kubenswrapper[4791]: E0217 00:20:23.898341 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2\": container with ID starting with eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2 not found: ID does not exist" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.898367 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2"} err="failed to get container status \"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2\": rpc error: code = NotFound desc = could not find container \"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2\": container with ID starting with eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2 not found: ID does not exist" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.956295 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.956425 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 
00:20:23.995288 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:24 crc kubenswrapper[4791]: I0217 00:20:24.790709 4791 generic.go:334] "Generic (PLEG): container finished" podID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerID="6bc390cf261ee2ce905ed54511ecd2d4889323ecd586e5525896cd466846b745" exitCode=0 Feb 17 00:20:24 crc kubenswrapper[4791]: I0217 00:20:24.790773 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"6bc390cf261ee2ce905ed54511ecd2d4889323ecd586e5525896cd466846b745"} Feb 17 00:20:25 crc kubenswrapper[4791]: I0217 00:20:25.232428 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde8eea2-068e-4791-ad76-164945e7d646" path="/var/lib/kubelet/pods/bde8eea2-068e-4791-ad76-164945e7d646/volumes" Feb 17 00:20:25 crc kubenswrapper[4791]: I0217 00:20:25.871260 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.083484 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrqd"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.127979 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.128263 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v6qjq" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" containerID="cri-o://ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" gracePeriod=2 Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.487949 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.592025 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.592075 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.592222 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.593591 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities" (OuterVolumeSpecName: "utilities") pod "eebe5038-a970-42a4-81d4-fa84e6a64dd2" (UID: "eebe5038-a970-42a4-81d4-fa84e6a64dd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.597575 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq" (OuterVolumeSpecName: "kube-api-access-729vq") pod "eebe5038-a970-42a4-81d4-fa84e6a64dd2" (UID: "eebe5038-a970-42a4-81d4-fa84e6a64dd2"). InnerVolumeSpecName "kube-api-access-729vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.643491 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eebe5038-a970-42a4-81d4-fa84e6a64dd2" (UID: "eebe5038-a970-42a4-81d4-fa84e6a64dd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.693121 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.693150 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.693161 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.816604 4791 generic.go:334] "Generic (PLEG): container finished" podID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerID="e80128347dc7c950dc836d68c8b67a7cc01a8e43fb8f74b24f82416156d6c0c1" exitCode=0 Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.816689 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"e80128347dc7c950dc836d68c8b67a7cc01a8e43fb8f74b24f82416156d6c0c1"} Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.818948 4791 generic.go:334] "Generic (PLEG): container 
finished" podID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" exitCode=0 Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819030 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819072 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1"} Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819133 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c"} Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819158 4791 scope.go:117] "RemoveContainer" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.852450 4791 scope.go:117] "RemoveContainer" containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.853428 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_bcdc829d-2304-4576-8cdb-b6a15b577e54/manage-dockerfile/0.log" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.866599 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.871241 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.885602 4791 
scope.go:117] "RemoveContainer" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909229 4791 scope.go:117] "RemoveContainer" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" Feb 17 00:20:26 crc kubenswrapper[4791]: E0217 00:20:26.909583 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1\": container with ID starting with ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1 not found: ID does not exist" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909624 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1"} err="failed to get container status \"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1\": rpc error: code = NotFound desc = could not find container \"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1\": container with ID starting with ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1 not found: ID does not exist" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909651 4791 scope.go:117] "RemoveContainer" containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" Feb 17 00:20:26 crc kubenswrapper[4791]: E0217 00:20:26.909860 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c\": container with ID starting with d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c not found: ID does not exist" containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" Feb 17 
00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909880 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c"} err="failed to get container status \"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c\": rpc error: code = NotFound desc = could not find container \"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c\": container with ID starting with d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c not found: ID does not exist" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909893 4791 scope.go:117] "RemoveContainer" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" Feb 17 00:20:26 crc kubenswrapper[4791]: E0217 00:20:26.911704 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d\": container with ID starting with 737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d not found: ID does not exist" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.911730 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d"} err="failed to get container status \"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d\": rpc error: code = NotFound desc = could not find container \"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d\": container with ID starting with 737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d not found: ID does not exist" Feb 17 00:20:27 crc kubenswrapper[4791]: I0217 00:20:27.226699 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" 
path="/var/lib/kubelet/pods/eebe5038-a970-42a4-81d4-fa84e6a64dd2/volumes" Feb 17 00:20:27 crc kubenswrapper[4791]: I0217 00:20:27.829232 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerStarted","Data":"29e936b5fcf7ecd6131859059d0b7e9f208b819211cfcd228bed9247a6317ed7"} Feb 17 00:20:27 crc kubenswrapper[4791]: I0217 00:20:27.859471 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.85944953 podStartE2EDuration="5.85944953s" podCreationTimestamp="2026-02-17 00:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:20:27.855617091 +0000 UTC m=+885.335129648" watchObservedRunningTime="2026-02-17 00:20:27.85944953 +0000 UTC m=+885.338962067" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.776786 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777692 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-content" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777713 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-content" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777742 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777754 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777777 4791 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777790 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777806 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="manage-dockerfile" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777817 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="manage-dockerfile" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777837 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-utilities" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777849 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-utilities" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.778032 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.778057 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.779446 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.801911 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.967877 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.968009 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.968030 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.068889 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.068952 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.069016 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.069641 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.069663 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.095178 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.110562 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.613637 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:08 crc kubenswrapper[4791]: I0217 00:21:08.126224 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerStarted","Data":"23b09b406bee148561e1d38cf362a00225c7b1ed620ff54a974176c895c0e301"} Feb 17 00:21:09 crc kubenswrapper[4791]: I0217 00:21:09.132963 4791 generic.go:334] "Generic (PLEG): container finished" podID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" exitCode=0 Feb 17 00:21:09 crc kubenswrapper[4791]: I0217 00:21:09.133010 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f"} Feb 17 00:21:11 crc kubenswrapper[4791]: I0217 00:21:11.144978 4791 generic.go:334] "Generic (PLEG): container finished" podID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" exitCode=0 Feb 17 00:21:11 crc kubenswrapper[4791]: I0217 00:21:11.145050 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05"} Feb 17 00:21:12 crc kubenswrapper[4791]: I0217 00:21:12.152906 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" 
event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerStarted","Data":"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71"} Feb 17 00:21:12 crc kubenswrapper[4791]: I0217 00:21:12.177967 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-44n5c" podStartSLOduration=3.708926854 podStartE2EDuration="6.177944743s" podCreationTimestamp="2026-02-17 00:21:06 +0000 UTC" firstStartedPulling="2026-02-17 00:21:09.136626164 +0000 UTC m=+926.616138691" lastFinishedPulling="2026-02-17 00:21:11.605644053 +0000 UTC m=+929.085156580" observedRunningTime="2026-02-17 00:21:12.173379903 +0000 UTC m=+929.652892430" watchObservedRunningTime="2026-02-17 00:21:12.177944743 +0000 UTC m=+929.657457270" Feb 17 00:21:17 crc kubenswrapper[4791]: I0217 00:21:17.111556 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:17 crc kubenswrapper[4791]: I0217 00:21:17.111970 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:18 crc kubenswrapper[4791]: I0217 00:21:18.156734 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-44n5c" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" probeResult="failure" output=< Feb 17 00:21:18 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:21:18 crc kubenswrapper[4791]: > Feb 17 00:21:27 crc kubenswrapper[4791]: I0217 00:21:27.190926 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:27 crc kubenswrapper[4791]: I0217 00:21:27.251822 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:27 crc kubenswrapper[4791]: I0217 
00:21:27.428480 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.278552 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-44n5c" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" containerID="cri-o://bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" gracePeriod=2 Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.675791 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.809536 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.809603 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.809720 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.810795 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities" (OuterVolumeSpecName: 
"utilities") pod "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" (UID: "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.815971 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc" (OuterVolumeSpecName: "kube-api-access-vckfc") pod "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" (UID: "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b"). InnerVolumeSpecName "kube-api-access-vckfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.911371 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.911400 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.964757 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" (UID: "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.012628 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288144 4791 generic.go:334] "Generic (PLEG): container finished" podID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" exitCode=0 Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288192 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71"} Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288262 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"23b09b406bee148561e1d38cf362a00225c7b1ed620ff54a974176c895c0e301"} Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288291 4791 scope.go:117] "RemoveContainer" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288220 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.309493 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.315294 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.316369 4791 scope.go:117] "RemoveContainer" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.357485 4791 scope.go:117] "RemoveContainer" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.377142 4791 scope.go:117] "RemoveContainer" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" Feb 17 00:21:29 crc kubenswrapper[4791]: E0217 00:21:29.377689 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71\": container with ID starting with bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71 not found: ID does not exist" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.377726 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71"} err="failed to get container status \"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71\": rpc error: code = NotFound desc = could not find container \"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71\": container with ID starting with bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71 not found: ID does 
not exist" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.377752 4791 scope.go:117] "RemoveContainer" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" Feb 17 00:21:29 crc kubenswrapper[4791]: E0217 00:21:29.378042 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05\": container with ID starting with de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05 not found: ID does not exist" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.378070 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05"} err="failed to get container status \"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05\": rpc error: code = NotFound desc = could not find container \"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05\": container with ID starting with de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05 not found: ID does not exist" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.378087 4791 scope.go:117] "RemoveContainer" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" Feb 17 00:21:29 crc kubenswrapper[4791]: E0217 00:21:29.378443 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f\": container with ID starting with 0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f not found: ID does not exist" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.378485 4791 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f"} err="failed to get container status \"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f\": rpc error: code = NotFound desc = could not find container \"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f\": container with ID starting with 0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f not found: ID does not exist" Feb 17 00:21:31 crc kubenswrapper[4791]: I0217 00:21:31.228421 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" path="/var/lib/kubelet/pods/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b/volumes" Feb 17 00:21:34 crc kubenswrapper[4791]: I0217 00:21:34.323975 4791 generic.go:334] "Generic (PLEG): container finished" podID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerID="29e936b5fcf7ecd6131859059d0b7e9f208b819211cfcd228bed9247a6317ed7" exitCode=0 Feb 17 00:21:34 crc kubenswrapper[4791]: I0217 00:21:34.324036 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"29e936b5fcf7ecd6131859059d0b7e9f208b819211cfcd228bed9247a6317ed7"} Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.617249 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800432 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800549 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800592 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800627 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800714 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800783 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800830 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800901 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800945 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801014 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801071 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801200 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801256 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801702 4791 reconciler_common.go:293] "Volume detached 
for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.802560 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.803360 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.803970 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.804769 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.805138 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.805553 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.807727 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.808753 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.809486 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn" (OuterVolumeSpecName: "kube-api-access-zm2bn") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "kube-api-access-zm2bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902694 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902731 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902744 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902755 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902766 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902778 4791 
reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902788 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902800 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902811 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.001064 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.003521 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.340665 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"89476eba0b9817ae2dd5c5b95b35b38fe9d88bc0388a005c9bcf9bbec9490271"} Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.341141 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89476eba0b9817ae2dd5c5b95b35b38fe9d88bc0388a005c9bcf9bbec9490271" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.340720 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:21:37 crc kubenswrapper[4791]: I0217 00:21:37.997862 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:38 crc kubenswrapper[4791]: I0217 00:21:38.052994 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.313083 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314045 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="manage-dockerfile" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314077 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="manage-dockerfile" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314102 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-utilities" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314149 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-utilities" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314167 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-content" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314183 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-content" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314209 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="git-clone" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314224 4791 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="git-clone" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314257 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="docker-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314272 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="docker-build" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314294 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314309 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314553 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314586 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="docker-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.315852 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.318318 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.319373 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.319837 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.320441 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.333990 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381248 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381321 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381393 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") 
pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381635 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381763 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381830 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381898 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381965 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382087 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382208 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382283 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382377 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483144 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483406 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483497 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483577 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483516 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483707 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"sg-core-1-build\" 
(UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483783 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483867 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483935 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484021 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484102 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 
00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484220 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484279 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484278 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484301 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484397 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484421 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484699 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.485013 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.485037 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.485647 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.489860 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.492738 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.508499 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.635391 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:41 crc kubenswrapper[4791]: I0217 00:21:41.078367 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:41 crc kubenswrapper[4791]: I0217 00:21:41.370841 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerStarted","Data":"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4"} Feb 17 00:21:41 crc kubenswrapper[4791]: I0217 00:21:41.371967 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerStarted","Data":"cbdee32da39609493a5fdab3853bbd124ef1afb74d05c58293494e87a42008c0"} Feb 17 00:21:42 crc kubenswrapper[4791]: I0217 00:21:42.383187 4791 generic.go:334] "Generic (PLEG): container finished" podID="f212a215-55c7-48e3-a353-e0b74a390123" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" exitCode=0 Feb 17 00:21:42 crc kubenswrapper[4791]: I0217 00:21:42.383313 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerDied","Data":"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4"} Feb 17 00:21:43 crc kubenswrapper[4791]: I0217 00:21:43.394672 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerStarted","Data":"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa"} Feb 17 00:21:43 crc kubenswrapper[4791]: I0217 00:21:43.429972 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.429947323 podStartE2EDuration="3.429947323s" podCreationTimestamp="2026-02-17 
00:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:21:43.429500719 +0000 UTC m=+960.909013266" watchObservedRunningTime="2026-02-17 00:21:43.429947323 +0000 UTC m=+960.909459860" Feb 17 00:21:50 crc kubenswrapper[4791]: I0217 00:21:50.908030 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:50 crc kubenswrapper[4791]: I0217 00:21:50.908910 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" containerID="cri-o://e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" gracePeriod=30 Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.264715 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f212a215-55c7-48e3-a353-e0b74a390123/docker-build/0.log" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.265637 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.353732 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354302 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354338 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354390 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354481 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354575 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354629 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354695 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354803 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354845 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod 
\"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354879 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355191 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355206 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355265 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355642 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355673 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355687 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355880 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355955 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.356673 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.358914 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.363680 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws" (OuterVolumeSpecName: "kube-api-access-kdzws") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "kube-api-access-kdzws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.363711 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.364828 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.457357 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.457875 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.457962 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458056 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458152 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458217 4791 
reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458275 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.459533 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.468899 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f212a215-55c7-48e3-a353-e0b74a390123/docker-build/0.log" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.469815 4791 generic.go:334] "Generic (PLEG): container finished" podID="f212a215-55c7-48e3-a353-e0b74a390123" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" exitCode=1 Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.469955 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerDied","Data":"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa"} Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.470056 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" 
event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerDied","Data":"cbdee32da39609493a5fdab3853bbd124ef1afb74d05c58293494e87a42008c0"} Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.470185 4791 scope.go:117] "RemoveContainer" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.469988 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.497348 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.504350 4791 scope.go:117] "RemoveContainer" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.534133 4791 scope.go:117] "RemoveContainer" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" Feb 17 00:21:51 crc kubenswrapper[4791]: E0217 00:21:51.534628 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa\": container with ID starting with e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa not found: ID does not exist" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.534681 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa"} err="failed to get container status \"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa\": rpc error: code = NotFound desc = could not find container \"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa\": container with ID starting with e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa not found: ID does not exist" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.534706 4791 scope.go:117] "RemoveContainer" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" Feb 17 00:21:51 crc kubenswrapper[4791]: E0217 00:21:51.535045 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4\": container with ID starting with f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4 not found: ID does not exist" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.535088 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4"} err="failed to get container status \"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4\": rpc error: code = NotFound desc = could not find container \"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4\": container with ID starting with f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4 not found: ID does not exist" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.559688 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 
crc kubenswrapper[4791]: I0217 00:21:51.559717 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.829381 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.838828 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499024 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 17 00:21:52 crc kubenswrapper[4791]: E0217 00:21:52.499372 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="manage-dockerfile" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499395 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="manage-dockerfile" Feb 17 00:21:52 crc kubenswrapper[4791]: E0217 00:21:52.499417 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499430 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499619 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.500801 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.502942 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.503179 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.503440 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.508525 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.540280 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.572534 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.572874 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573000 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573158 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573305 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573413 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573505 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573621 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573723 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573826 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573920 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.574020 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.675942 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676640 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676756 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676814 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676854 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676886 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676970 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677024 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677075 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677211 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"sg-core-2-build\" (UID: 
\"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677264 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677301 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678008 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678067 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678247 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc 
kubenswrapper[4791]: I0217 00:21:52.678482 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678566 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678594 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.679236 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.679679 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.682226 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" 
(UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.684600 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.712393 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.829245 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:21:53 crc kubenswrapper[4791]: I0217 00:21:53.235028 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f212a215-55c7-48e3-a353-e0b74a390123" path="/var/lib/kubelet/pods/f212a215-55c7-48e3-a353-e0b74a390123/volumes" Feb 17 00:21:53 crc kubenswrapper[4791]: I0217 00:21:53.309766 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 17 00:21:53 crc kubenswrapper[4791]: I0217 00:21:53.491078 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerStarted","Data":"95153ff0c73c10588f012d75771a6f42fab1cc01a858b352d5f294ae0d3f9091"} Feb 17 00:21:54 crc kubenswrapper[4791]: I0217 00:21:54.502678 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerStarted","Data":"dc5c21e29377746db37307f253136c118eeb822bad13d9daca12b013030f9b79"} Feb 17 00:21:55 crc kubenswrapper[4791]: I0217 00:21:55.514158 4791 generic.go:334] "Generic (PLEG): container finished" podID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerID="dc5c21e29377746db37307f253136c118eeb822bad13d9daca12b013030f9b79" exitCode=0 Feb 17 00:21:55 crc kubenswrapper[4791]: I0217 00:21:55.514258 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"dc5c21e29377746db37307f253136c118eeb822bad13d9daca12b013030f9b79"} Feb 17 00:21:56 crc kubenswrapper[4791]: I0217 00:21:56.520825 4791 generic.go:334] "Generic (PLEG): container finished" podID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerID="12bceec9da4fba16bd41634c80e6059d37799ddeaa3084515a4774bbe05b75ca" exitCode=0 Feb 17 00:21:56 crc kubenswrapper[4791]: I0217 00:21:56.520873 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"12bceec9da4fba16bd41634c80e6059d37799ddeaa3084515a4774bbe05b75ca"} Feb 17 00:21:56 crc kubenswrapper[4791]: I0217 00:21:56.564515 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_eaaeddf5-72cd-46f1-b62e-83bf81db9dfa/manage-dockerfile/0.log" Feb 17 00:21:57 crc kubenswrapper[4791]: I0217 00:21:57.530394 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerStarted","Data":"a2bfd7a1bbc9055b744243dfb16479f7bfe71f03549db82ec9e06b1a48794e06"} Feb 17 00:21:57 crc kubenswrapper[4791]: I0217 00:21:57.564644 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.564618831 podStartE2EDuration="5.564618831s" podCreationTimestamp="2026-02-17 00:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:21:57.561274748 +0000 UTC m=+975.040787315" watchObservedRunningTime="2026-02-17 00:21:57.564618831 +0000 UTC m=+975.044131358" Feb 17 00:22:24 crc kubenswrapper[4791]: I0217 00:22:24.973548 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:22:24 crc kubenswrapper[4791]: I0217 00:22:24.974288 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:22:54 crc kubenswrapper[4791]: I0217 00:22:54.973585 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:22:54 crc kubenswrapper[4791]: I0217 00:22:54.975400 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.973177 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.973702 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.973753 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.974355 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.974409 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64" gracePeriod=600 Feb 17 00:23:25 crc kubenswrapper[4791]: I0217 00:23:25.159923 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64" exitCode=0 Feb 17 00:23:25 crc kubenswrapper[4791]: I0217 00:23:25.159996 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64"} Feb 17 00:23:25 crc kubenswrapper[4791]: I0217 00:23:25.160375 4791 scope.go:117] "RemoveContainer" containerID="25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb" Feb 17 00:23:26 crc kubenswrapper[4791]: I0217 00:23:26.169762 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1"} Feb 17 00:25:09 crc kubenswrapper[4791]: I0217 00:25:09.902689 4791 generic.go:334] "Generic (PLEG): container finished" podID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" 
containerID="a2bfd7a1bbc9055b744243dfb16479f7bfe71f03549db82ec9e06b1a48794e06" exitCode=0 Feb 17 00:25:09 crc kubenswrapper[4791]: I0217 00:25:09.902757 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"a2bfd7a1bbc9055b744243dfb16479f7bfe71f03549db82ec9e06b1a48794e06"} Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.266188 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347298 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347348 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347391 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347428 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: 
\"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347462 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347487 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347509 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347536 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347560 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347546 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347582 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347699 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347789 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.348481 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.348497 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: 
"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.349175 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.349210 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.352872 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.354419 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.361574 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.361619 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9" (OuterVolumeSpecName: "kube-api-access-9kgz9") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "kube-api-access-9kgz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.361875 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.370294 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.451479 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.451898 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452082 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452316 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452485 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452658 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452867 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.453058 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.453272 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.719484 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.758051 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.920941 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"95153ff0c73c10588f012d75771a6f42fab1cc01a858b352d5f294ae0d3f9091"} Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.920996 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95153ff0c73c10588f012d75771a6f42fab1cc01a858b352d5f294ae0d3f9091" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.921045 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:25:14 crc kubenswrapper[4791]: I0217 00:25:14.640658 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:14 crc kubenswrapper[4791]: I0217 00:25:14.710191 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.673437 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:15 crc kubenswrapper[4791]: E0217 00:25:15.673803 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="manage-dockerfile" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.673826 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="manage-dockerfile" Feb 17 00:25:15 crc kubenswrapper[4791]: E0217 00:25:15.673842 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="docker-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.673854 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="docker-build" Feb 17 00:25:15 crc kubenswrapper[4791]: E0217 00:25:15.673880 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="git-clone" Feb 17 00:25:15 crc 
kubenswrapper[4791]: I0217 00:25:15.673893 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="git-clone" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.674069 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="docker-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.675175 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.680513 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.681839 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.682807 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.685168 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.713965 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832091 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832187 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832224 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832444 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832539 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832721 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832785 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832831 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832879 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832959 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.833085 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.833289 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934327 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934375 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934408 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934430 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934449 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934481 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934518 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934588 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934618 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934814 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934840 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934870 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.935304 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934584 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.935568 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.935776 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.936167 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.936499 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.936878 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.937493 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.944941 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.946207 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.972766 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.007323 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.269035 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.965951 4791 generic.go:334] "Generic (PLEG): container finished" podID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerID="9e535bde2286bdb920242dd34aac2ed1fa15b9ef0478082f22083bc8a2746eb5" exitCode=0
Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.966142 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerDied","Data":"9e535bde2286bdb920242dd34aac2ed1fa15b9ef0478082f22083bc8a2746eb5"}
Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.966286 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerStarted","Data":"a8ad63dc01e765b914019b189cc098bd7c13ee5b3d0bff5696bd4f8ebabdb868"}
Feb 17 00:25:17 crc kubenswrapper[4791]: I0217 00:25:17.980645 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerStarted","Data":"78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636"}
Feb 17 00:25:18 crc kubenswrapper[4791]: I0217 00:25:18.021487 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.021452435 podStartE2EDuration="3.021452435s" podCreationTimestamp="2026-02-17 00:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:25:18.01031677 +0000 UTC m=+1175.489829307" watchObservedRunningTime="2026-02-17 00:25:18.021452435 +0000 UTC m=+1175.500965002"
Feb 17 00:25:25 crc kubenswrapper[4791]: I0217 00:25:25.999023 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.000264 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build" containerID="cri-o://78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636" gracePeriod=30
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.049200 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/docker-build/0.log"
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.050101 4791 generic.go:334] "Generic (PLEG): container finished" podID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerID="78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636" exitCode=1
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.050200 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerDied","Data":"78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636"}
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.342529 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/docker-build/0.log"
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.343227 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.485966 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486075 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486087 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486141 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486183 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486267 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486325 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486364 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486404 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486430 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486455 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486491 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486521 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") "
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486974 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486994 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.487027 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.488153 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.488487 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.488710 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.487623 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.492416 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.493260 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.494525 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg" (OuterVolumeSpecName: "kube-api-access-nd9fg") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "kube-api-access-nd9fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.567204 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588554 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588580 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588591 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588599 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588608 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588616 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588624 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588632 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588640 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.046488 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.059847 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/docker-build/0.log"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.060511 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerDied","Data":"a8ad63dc01e765b914019b189cc098bd7c13ee5b3d0bff5696bd4f8ebabdb868"}
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.060619 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.060634 4791 scope.go:117] "RemoveContainer" containerID="78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.092999 4791 scope.go:117] "RemoveContainer" containerID="9e535bde2286bdb920242dd34aac2ed1fa15b9ef0478082f22083bc8a2746eb5"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.100741 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.117793 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.123582 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.235149 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" path="/var/lib/kubelet/pods/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/volumes"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.709453 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"]
Feb 17 00:25:27 crc kubenswrapper[4791]: E0217 00:25:27.710342 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="manage-dockerfile"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.710457 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="manage-dockerfile"
Feb 17 00:25:27 crc kubenswrapper[4791]: E0217 00:25:27.710580 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.710657 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.710849 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.712151 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.714402 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.714438 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.714825 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.715077 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.749689 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"]
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.810949 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811205 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811266 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811303 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811336 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811383 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811418 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811503 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811546 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811587 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811629 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.912892 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913233 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913288 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913330 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913397 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913399 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913448 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913508 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913553 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913609 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build"
Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913670 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\")
pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913728 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913813 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914387 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914675 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914794 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " 
pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914941 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914963 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.915601 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.916090 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.916316 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.922605 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.922669 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.944356 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:28 crc kubenswrapper[4791]: I0217 00:25:28.029866 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:28 crc kubenswrapper[4791]: I0217 00:25:28.297714 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 17 00:25:29 crc kubenswrapper[4791]: I0217 00:25:29.090190 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerStarted","Data":"5ed6c7362353b66ccb65fbc96862f8cfe04c1a0062529c74c629bf54b700a947"} Feb 17 00:25:29 crc kubenswrapper[4791]: I0217 00:25:29.090253 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerStarted","Data":"959ba61d0afc9ccfd7a3013d85270690dccd549a55f01c5072900e4cab2d75f9"} Feb 17 00:25:30 crc kubenswrapper[4791]: I0217 00:25:30.099876 4791 generic.go:334] "Generic (PLEG): container finished" podID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerID="5ed6c7362353b66ccb65fbc96862f8cfe04c1a0062529c74c629bf54b700a947" exitCode=0 Feb 17 00:25:30 crc kubenswrapper[4791]: I0217 00:25:30.099952 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"5ed6c7362353b66ccb65fbc96862f8cfe04c1a0062529c74c629bf54b700a947"} Feb 17 00:25:31 crc kubenswrapper[4791]: I0217 00:25:31.110902 4791 generic.go:334] "Generic (PLEG): container finished" podID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerID="54acd77816ddad2b49a1d79681a0cc4bd31839782e4e81051562044e5f869d33" exitCode=0 Feb 17 00:25:31 crc kubenswrapper[4791]: I0217 00:25:31.110977 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"54acd77816ddad2b49a1d79681a0cc4bd31839782e4e81051562044e5f869d33"} Feb 17 00:25:31 
crc kubenswrapper[4791]: I0217 00:25:31.171533 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_6809eb42-95c9-4cb7-b793-c4e855bd8f29/manage-dockerfile/0.log" Feb 17 00:25:32 crc kubenswrapper[4791]: I0217 00:25:32.122873 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerStarted","Data":"6dbd63f68048cb7fd58a16ab0b938520a22f44df21885c80133588df55d80571"} Feb 17 00:25:54 crc kubenswrapper[4791]: I0217 00:25:54.973583 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:25:54 crc kubenswrapper[4791]: I0217 00:25:54.974239 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:26:16 crc kubenswrapper[4791]: I0217 00:26:16.467161 4791 generic.go:334] "Generic (PLEG): container finished" podID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerID="6dbd63f68048cb7fd58a16ab0b938520a22f44df21885c80133588df55d80571" exitCode=0 Feb 17 00:26:16 crc kubenswrapper[4791]: I0217 00:26:16.467278 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"6dbd63f68048cb7fd58a16ab0b938520a22f44df21885c80133588df55d80571"} Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.756923 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844248 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844343 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844400 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844427 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844984 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845309 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845342 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845361 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845381 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845432 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845456 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845496 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845519 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845772 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845377 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846290 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846337 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846725 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846915 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.847433 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.849707 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.850034 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.850176 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6" (OuterVolumeSpecName: "kube-api-access-rjnh6") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "kube-api-access-rjnh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946450 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946481 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946490 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946498 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946506 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946515 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946523 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946532 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946542 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.963597 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.047259 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.484805 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"959ba61d0afc9ccfd7a3013d85270690dccd549a55f01c5072900e4cab2d75f9"} Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.484850 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959ba61d0afc9ccfd7a3013d85270690dccd549a55f01c5072900e4cab2d75f9" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.484855 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.513085 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.552955 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.130737 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:22 crc kubenswrapper[4791]: E0217 00:26:22.131610 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="docker-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.131642 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="docker-build" Feb 17 00:26:22 crc kubenswrapper[4791]: E0217 00:26:22.131670 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="manage-dockerfile" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.131687 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="manage-dockerfile" Feb 17 00:26:22 crc kubenswrapper[4791]: E0217 00:26:22.131723 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="git-clone" Feb 17 00:26:22 crc 
kubenswrapper[4791]: I0217 00:26:22.131740 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="git-clone"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.132014 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="docker-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.133478 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.135665 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.135718 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.136143 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.136147 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.145748 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.303805 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.303899 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.303969 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304016 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304073 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304134 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304315 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304392 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304444 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304565 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304602 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304662 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405465 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405533 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405575 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405617 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405647 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405699 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405734 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405786 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405815 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405839 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405872 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405894 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406073 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405974 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406276 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406689 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407301 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407388 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407427 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407910 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.413466 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.423349 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.447835 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.449446 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.920169 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 17 00:26:23 crc kubenswrapper[4791]: I0217 00:26:23.523695 4791 generic.go:334] "Generic (PLEG): container finished" podID="4859167c-9cba-498e-85e0-25710c5c93ec" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33" exitCode=0
Feb 17 00:26:23 crc kubenswrapper[4791]: I0217 00:26:23.523793 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerDied","Data":"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"}
Feb 17 00:26:23 crc kubenswrapper[4791]: I0217 00:26:23.523944 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerStarted","Data":"4cf7ca9231f7176ed8c8ea9c4c5efff6bf42ad35a9176f33da4b1f486678be9c"}
Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.534587 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerStarted","Data":"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"}
Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.571355 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=2.571334542 podStartE2EDuration="2.571334542s" podCreationTimestamp="2026-02-17 00:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:26:24.567492122 +0000 UTC m=+1242.047004659" watchObservedRunningTime="2026-02-17 00:26:24.571334542 +0000 UTC m=+1242.050847069"
Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.972826 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.972894 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:26:32 crc kubenswrapper[4791]: I0217 00:26:32.881731 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 17 00:26:32 crc kubenswrapper[4791]: I0217 00:26:32.882676 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build" containerID="cri-o://59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" gracePeriod=30
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.304808 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_4859167c-9cba-498e-85e0-25710c5c93ec/docker-build/0.log"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.305770 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356171 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356327 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356411 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356481 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356540 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356610 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356665 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356750 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356802 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356854 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356922 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356996 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") "
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357095 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357517 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357524 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357571 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357625 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357664 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.359637 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.360443 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.363699 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.366090 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6" (OuterVolumeSpecName: "kube-api-access-675m6") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "kube-api-access-675m6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.369270 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.453625 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459765 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459813 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459874 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459894 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459911 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459927 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459944 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459960 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459977 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459993 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.606897 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_4859167c-9cba-498e-85e0-25710c5c93ec/docker-build/0.log"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607767 4791 generic.go:334] "Generic (PLEG): container finished" podID="4859167c-9cba-498e-85e0-25710c5c93ec" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" exitCode=1
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607815 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerDied","Data":"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"}
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607853 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerDied","Data":"4cf7ca9231f7176ed8c8ea9c4c5efff6bf42ad35a9176f33da4b1f486678be9c"}
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607922 4791 scope.go:117] "RemoveContainer" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.608090 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.634594 4791 scope.go:117] "RemoveContainer" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.669484 4791 scope.go:117] "RemoveContainer" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"
Feb 17 00:26:33 crc kubenswrapper[4791]: E0217 00:26:33.670346 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb\": container with ID starting with 59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb not found: ID does not exist" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.670419 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"} err="failed to get container status \"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb\": rpc error: code = NotFound desc = could not find container \"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb\": container with ID starting with 59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb not found: ID does not exist"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.670458 4791 scope.go:117] "RemoveContainer" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"
Feb 17 00:26:33 crc kubenswrapper[4791]: E0217 00:26:33.670892 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33\": container with ID starting with 483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33 not found: ID does not exist" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.670959 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"} err="failed to get container status \"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33\": rpc error: code = NotFound desc = could not find container \"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33\": container with ID starting with 483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33 not found: ID does not exist"
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.824575 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.866855 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.940732 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.947628 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556127 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Feb 17 00:26:34 crc kubenswrapper[4791]: E0217 00:26:34.556413 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="manage-dockerfile"
Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556432 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="manage-dockerfile"
Feb 17 00:26:34 crc kubenswrapper[4791]: E0217 00:26:34.556444 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build"
Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556450 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build"
Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556553 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build"
Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.557333 4791 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.559454 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.559962 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.560327 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.560708 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.578724 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.578972 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579125 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") 
pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579235 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579374 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579494 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579601 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579720 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579815 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579932 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579823 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.580022 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.580265 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681315 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681374 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681409 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681438 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681464 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681494 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681522 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681559 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681589 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681628 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681648 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681671 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681732 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682048 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682148 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682216 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682427 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682671 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682817 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682911 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682912 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.689187 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.700858 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.707751 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7tm\" (UniqueName: 
\"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.870487 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:35 crc kubenswrapper[4791]: I0217 00:26:35.159261 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 17 00:26:35 crc kubenswrapper[4791]: W0217 00:26:35.173876 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf17f317_5914_40b6_bfb7_12a157eb4b95.slice/crio-21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e WatchSource:0}: Error finding container 21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e: Status 404 returned error can't find the container with id 21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e Feb 17 00:26:35 crc kubenswrapper[4791]: I0217 00:26:35.248771 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" path="/var/lib/kubelet/pods/4859167c-9cba-498e-85e0-25710c5c93ec/volumes" Feb 17 00:26:35 crc kubenswrapper[4791]: I0217 00:26:35.626364 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerStarted","Data":"21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e"} Feb 17 00:26:36 crc kubenswrapper[4791]: I0217 00:26:36.634726 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerStarted","Data":"070a583a962a9fe7b0ef5baf99e88b5a7b72f4c99439f3c4e1e3e4d468484fff"} Feb 17 00:26:37 crc kubenswrapper[4791]: I0217 00:26:37.646669 4791 generic.go:334] "Generic (PLEG): container finished" podID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerID="070a583a962a9fe7b0ef5baf99e88b5a7b72f4c99439f3c4e1e3e4d468484fff" exitCode=0 Feb 17 00:26:37 crc kubenswrapper[4791]: I0217 00:26:37.647826 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"070a583a962a9fe7b0ef5baf99e88b5a7b72f4c99439f3c4e1e3e4d468484fff"} Feb 17 00:26:38 crc kubenswrapper[4791]: I0217 00:26:38.657002 4791 generic.go:334] "Generic (PLEG): container finished" podID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerID="5770db58104d06079dbc8cd1063caafc530f7c4ce8a1ab7a39d501a3ecc96f8c" exitCode=0 Feb 17 00:26:38 crc kubenswrapper[4791]: I0217 00:26:38.657160 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"5770db58104d06079dbc8cd1063caafc530f7c4ce8a1ab7a39d501a3ecc96f8c"} Feb 17 00:26:38 crc kubenswrapper[4791]: I0217 00:26:38.719123 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_df17f317-5914-40b6-bfb7-12a157eb4b95/manage-dockerfile/0.log" Feb 17 00:26:39 crc kubenswrapper[4791]: I0217 00:26:39.669525 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerStarted","Data":"67be1c6ded0357210ae6d498b2673cb5f728955a7c2e04e23f098517f94f7b7e"} Feb 17 00:26:39 crc kubenswrapper[4791]: I0217 00:26:39.749934 4791 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.74986321 podStartE2EDuration="5.74986321s" podCreationTimestamp="2026-02-17 00:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:26:39.728231488 +0000 UTC m=+1257.207744025" watchObservedRunningTime="2026-02-17 00:26:39.74986321 +0000 UTC m=+1257.229375767" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.973455 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974119 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974168 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974877 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974930 4791 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1" gracePeriod=600 Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780319 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1" exitCode=0 Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780437 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1"} Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780864 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"} Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780909 4791 scope.go:117] "RemoveContainer" containerID="722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64" Feb 17 00:27:35 crc kubenswrapper[4791]: I0217 00:27:35.051423 4791 generic.go:334] "Generic (PLEG): container finished" podID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerID="67be1c6ded0357210ae6d498b2673cb5f728955a7c2e04e23f098517f94f7b7e" exitCode=0 Feb 17 00:27:35 crc kubenswrapper[4791]: I0217 00:27:35.051526 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"67be1c6ded0357210ae6d498b2673cb5f728955a7c2e04e23f098517f94f7b7e"} Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 
00:27:36.311565 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403630 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403692 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403757 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403785 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403807 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 
00:27:36.403833 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403854 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403880 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403895 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403914 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403930 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403953 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.404166 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.404537 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.405217 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.405822 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.406513 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.407281 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.407430 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.416277 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm" (OuterVolumeSpecName: "kube-api-access-7t7tm") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "kube-api-access-7t7tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.416340 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.416412 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505781 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505827 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505844 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505861 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505878 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505894 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505909 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 
crc kubenswrapper[4791]: I0217 00:27:36.505925 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505940 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505955 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.547602 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.607788 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.074203 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e"} Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.074260 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.074387 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.397676 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.418144 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.983753 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-6f787cb998-k5dw6"] Feb 17 00:27:41 crc kubenswrapper[4791]: E0217 00:27:41.984576 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="docker-build" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984590 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="docker-build" Feb 17 00:27:41 crc kubenswrapper[4791]: E0217 00:27:41.984614 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="manage-dockerfile" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984620 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="manage-dockerfile" Feb 17 00:27:41 crc kubenswrapper[4791]: E0217 00:27:41.984628 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="git-clone" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984634 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="git-clone" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984735 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="docker-build" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.985222 4791 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.987173 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-p8dw4" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.000251 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-6f787cb998-k5dw6"] Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.074526 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/29c809ce-6a9b-4496-9c8e-8cd4506d926b-runner\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.074603 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmllp\" (UniqueName: \"kubernetes.io/projected/29c809ce-6a9b-4496-9c8e-8cd4506d926b-kube-api-access-hmllp\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.175331 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/29c809ce-6a9b-4496-9c8e-8cd4506d926b-runner\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.175396 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmllp\" (UniqueName: 
\"kubernetes.io/projected/29c809ce-6a9b-4496-9c8e-8cd4506d926b-kube-api-access-hmllp\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.175906 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/29c809ce-6a9b-4496-9c8e-8cd4506d926b-runner\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.196896 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmllp\" (UniqueName: \"kubernetes.io/projected/29c809ce-6a9b-4496-9c8e-8cd4506d926b-kube-api-access-hmllp\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.321412 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.512338 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-6f787cb998-k5dw6"] Feb 17 00:27:42 crc kubenswrapper[4791]: W0217 00:27:42.514577 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c809ce_6a9b_4496_9c8e_8cd4506d926b.slice/crio-168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708 WatchSource:0}: Error finding container 168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708: Status 404 returned error can't find the container with id 168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708 Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.517203 4791 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 00:27:43 crc kubenswrapper[4791]: I0217 00:27:43.115298 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" event={"ID":"29c809ce-6a9b-4496-9c8e-8cd4506d926b","Type":"ContainerStarted","Data":"168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708"} Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.257265 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7"] Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.258689 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.262266 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-hwbvw" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.275884 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7"] Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.358721 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-runner\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.358814 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpgt\" (UniqueName: \"kubernetes.io/projected/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-kube-api-access-dvpgt\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.459576 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-runner\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.459848 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpgt\" (UniqueName: 
\"kubernetes.io/projected/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-kube-api-access-dvpgt\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.460054 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-runner\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.479536 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpgt\" (UniqueName: \"kubernetes.io/projected/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-kube-api-access-dvpgt\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.578871 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:52 crc kubenswrapper[4791]: I0217 00:27:52.912496 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7"] Feb 17 00:27:57 crc kubenswrapper[4791]: W0217 00:27:57.487042 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8e979be_fe5a_4d89_b1a6_0260fffdd27c.slice/crio-aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c WatchSource:0}: Error finding container aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c: Status 404 returned error can't find the container with id aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c Feb 17 00:27:58 crc kubenswrapper[4791]: I0217 00:27:58.221501 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" event={"ID":"c8e979be-fe5a-4d89-b1a6-0260fffdd27c","Type":"ContainerStarted","Data":"aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c"} Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.065614 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.065839 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1771288058,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmllp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop
:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-6f787cb998-k5dw6_service-telemetry(29c809ce-6a9b-4496-9c8e-8cd4506d926b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.067207 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" podUID="29c809ce-6a9b-4496-9c8e-8cd4506d926b" Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.229556 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" podUID="29c809ce-6a9b-4496-9c8e-8cd4506d926b" Feb 17 00:28:04 crc kubenswrapper[4791]: I0217 00:28:04.264928 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" event={"ID":"c8e979be-fe5a-4d89-b1a6-0260fffdd27c","Type":"ContainerStarted","Data":"dcc4b3416ec3b357c3c1d368d158eb7719bdb7e72028515c225f4b5839c54a14"} Feb 17 00:28:04 crc kubenswrapper[4791]: I0217 00:28:04.302943 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" podStartSLOduration=10.209074012 
podStartE2EDuration="16.302914872s" podCreationTimestamp="2026-02-17 00:27:48 +0000 UTC" firstStartedPulling="2026-02-17 00:27:57.492655625 +0000 UTC m=+1334.972168152" lastFinishedPulling="2026-02-17 00:28:03.586496445 +0000 UTC m=+1341.066009012" observedRunningTime="2026-02-17 00:28:04.28962646 +0000 UTC m=+1341.769139027" watchObservedRunningTime="2026-02-17 00:28:04.302914872 +0000 UTC m=+1341.782427439" Feb 17 00:28:13 crc kubenswrapper[4791]: I0217 00:28:13.334708 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" event={"ID":"29c809ce-6a9b-4496-9c8e-8cd4506d926b","Type":"ContainerStarted","Data":"15fc434018528a24e3e0c2d2ad32c5b44b2835eea1569d6d4d295be2c2eb389b"} Feb 17 00:28:13 crc kubenswrapper[4791]: I0217 00:28:13.354476 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" podStartSLOduration=1.977734094 podStartE2EDuration="32.354457311s" podCreationTimestamp="2026-02-17 00:27:41 +0000 UTC" firstStartedPulling="2026-02-17 00:27:42.516913923 +0000 UTC m=+1319.996426450" lastFinishedPulling="2026-02-17 00:28:12.8936371 +0000 UTC m=+1350.373149667" observedRunningTime="2026-02-17 00:28:13.352537922 +0000 UTC m=+1350.832050489" watchObservedRunningTime="2026-02-17 00:28:13.354457311 +0000 UTC m=+1350.833969858" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.455524 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.456927 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.459606 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.459790 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.460047 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.460743 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.460872 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.461641 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-c5cq6" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.461912 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.484908 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518760 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " 
pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518837 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518923 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518966 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.519024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: 
I0217 00:28:26.519067 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.519127 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620660 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620761 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod 
\"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620800 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620833 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620871 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620902 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 
crc kubenswrapper[4791]: I0217 00:28:26.621657 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.627481 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.627579 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.627662 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.628305 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod 
\"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.641508 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.649238 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.791358 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:27 crc kubenswrapper[4791]: I0217 00:28:27.219554 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:28:27 crc kubenswrapper[4791]: I0217 00:28:27.422783 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerStarted","Data":"b23f71e1251fc3bfc8c2584134054c100e44edda849e4b8dcd9907f81f1b91b4"} Feb 17 00:28:32 crc kubenswrapper[4791]: I0217 00:28:32.460000 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerStarted","Data":"5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba"} Feb 17 00:28:32 crc kubenswrapper[4791]: I0217 00:28:32.483937 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" podStartSLOduration=1.87580377 podStartE2EDuration="6.483917338s" podCreationTimestamp="2026-02-17 00:28:26 +0000 UTC" firstStartedPulling="2026-02-17 00:28:27.22756315 +0000 UTC m=+1364.707075677" lastFinishedPulling="2026-02-17 00:28:31.835676718 +0000 UTC m=+1369.315189245" observedRunningTime="2026-02-17 00:28:32.480205262 +0000 UTC m=+1369.959717799" watchObservedRunningTime="2026-02-17 00:28:32.483917338 +0000 UTC m=+1369.963429865" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.613376 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.615423 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619594 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619680 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619889 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619950 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620035 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620160 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619614 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-7fpb8" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620299 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620500 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620736 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.636306 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/prometheus-default-0"] Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.667993 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668280 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-tls-assets\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668373 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668466 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668589 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-1\") pod 
\"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668689 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrbh\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-kube-api-access-mvrbh\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668798 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668924 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config-out\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668998 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.669232 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-web-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.669395 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.669537 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771356 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771449 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771591 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771652 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrbh\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-kube-api-access-mvrbh\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771735 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.772496 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.772525 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.772155 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773222 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config-out\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773286 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-web-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773363 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773423 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773473 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773504 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-tls-assets\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: E0217 00:28:36.773653 4791 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 17 00:28:36 crc kubenswrapper[4791]: E0217 00:28:36.773714 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls podName:af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9 nodeName:}" failed. No retries permitted until 2026-02-17 00:28:37.273698915 +0000 UTC m=+1374.753211532 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9") : secret "default-prometheus-proxy-tls" not found Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.774240 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.774473 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.779437 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.780484 4791 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.780523 4791 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/15b97e0c19fd885ae252fc1669bf27c70a61564e4d934a06044237c0a873e999/globalmount\"" pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.788591 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-web-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.791146 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-tls-assets\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.791270 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config-out\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.792345 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: 
\"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.796852 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrbh\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-kube-api-access-mvrbh\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.802421 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:37 crc kubenswrapper[4791]: I0217 00:28:37.279529 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:37 crc kubenswrapper[4791]: E0217 00:28:37.279750 4791 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 17 00:28:37 crc kubenswrapper[4791]: E0217 00:28:37.279810 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls podName:af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9 nodeName:}" failed. No retries permitted until 2026-02-17 00:28:38.279789598 +0000 UTC m=+1375.759302125 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9") : secret "default-prometheus-proxy-tls" not found Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.294503 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.302179 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.439240 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.898066 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 17 00:28:39 crc kubenswrapper[4791]: I0217 00:28:39.511039 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"3a7d02d6cf2e9460b02c43ab7b535613692b3989086ccd32bff5782a33504cd8"} Feb 17 00:28:44 crc kubenswrapper[4791]: I0217 00:28:44.544362 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"800a003e4a34de46b54655e05480a4349c1d5a3d6b4843b985ca73d32a00cac5"} Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.517411 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-z8m2z"] Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.518643 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.542960 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-z8m2z"] Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.612403 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5wt\" (UniqueName: \"kubernetes.io/projected/135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd-kube-api-access-cj5wt\") pod \"default-snmp-webhook-6856cfb745-z8m2z\" (UID: \"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.713779 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5wt\" (UniqueName: \"kubernetes.io/projected/135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd-kube-api-access-cj5wt\") pod \"default-snmp-webhook-6856cfb745-z8m2z\" (UID: \"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.738700 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5wt\" (UniqueName: \"kubernetes.io/projected/135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd-kube-api-access-cj5wt\") pod \"default-snmp-webhook-6856cfb745-z8m2z\" (UID: \"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.851904 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:47 crc kubenswrapper[4791]: I0217 00:28:47.085963 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-z8m2z"] Feb 17 00:28:47 crc kubenswrapper[4791]: W0217 00:28:47.087720 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod135a7c9c_cdfa_4baa_a4b3_ea9f6392d1cd.slice/crio-3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa WatchSource:0}: Error finding container 3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa: Status 404 returned error can't find the container with id 3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa Feb 17 00:28:47 crc kubenswrapper[4791]: I0217 00:28:47.567677 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" event={"ID":"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd","Type":"ContainerStarted","Data":"3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa"} Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.296372 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.298452 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306059 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306437 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306629 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306792 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-9svbl" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.310449 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.310627 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.311667 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379850 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-volume\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379892 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-out\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379930 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-web-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379989 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380010 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380033 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380061 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380144 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380175 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlzf\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-kube-api-access-9dlzf\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.480841 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481061 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlzf\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-kube-api-access-9dlzf\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 
00:28:50.481199 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-volume\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481275 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-out\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481364 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-web-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481443 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481517 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481596 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481664 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.481840 4791 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.481938 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls podName:df247d19-621c-4c9b-a436-d4f263dcb5ae nodeName:}" failed. No retries permitted until 2026-02-17 00:28:50.981921939 +0000 UTC m=+1388.461434466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "df247d19-621c-4c9b-a436-d4f263dcb5ae") : secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.494723 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.494964 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-out\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.495435 4791 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.495486 4791 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/51638830e311897aa0d4241a4b7178f92cff69c8608a060b8511b181cc9935b1/globalmount\"" pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.500713 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.501417 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-volume\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.502789 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.507092 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-web-config\") pod 
\"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.521244 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlzf\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-kube-api-access-9dlzf\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.577941 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.990350 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.990611 4791 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.991280 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls podName:df247d19-621c-4c9b-a436-d4f263dcb5ae nodeName:}" failed. No retries permitted until 2026-02-17 00:28:51.991246273 +0000 UTC m=+1389.470758830 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "df247d19-621c-4c9b-a436-d4f263dcb5ae") : secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:51 crc kubenswrapper[4791]: I0217 00:28:51.596947 4791 generic.go:334] "Generic (PLEG): container finished" podID="af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9" containerID="800a003e4a34de46b54655e05480a4349c1d5a3d6b4843b985ca73d32a00cac5" exitCode=0 Feb 17 00:28:51 crc kubenswrapper[4791]: I0217 00:28:51.596987 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerDied","Data":"800a003e4a34de46b54655e05480a4349c1d5a3d6b4843b985ca73d32a00cac5"} Feb 17 00:28:52 crc kubenswrapper[4791]: I0217 00:28:52.010555 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:52 crc kubenswrapper[4791]: E0217 00:28:52.010827 4791 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:52 crc kubenswrapper[4791]: E0217 00:28:52.010908 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls podName:df247d19-621c-4c9b-a436-d4f263dcb5ae nodeName:}" failed. No retries permitted until 2026-02-17 00:28:54.010886712 +0000 UTC m=+1391.490399239 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "df247d19-621c-4c9b-a436-d4f263dcb5ae") : secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:54 crc kubenswrapper[4791]: I0217 00:28:54.034808 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:54 crc kubenswrapper[4791]: I0217 00:28:54.040968 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:54 crc kubenswrapper[4791]: I0217 00:28:54.223217 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:55 crc kubenswrapper[4791]: I0217 00:28:55.979894 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 17 00:28:56 crc kubenswrapper[4791]: I0217 00:28:56.643602 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"9a8658cf5d59a0c49e5ceb447e7880e0e70a7fd978c3817b4a367f2f94def0e4"} Feb 17 00:28:56 crc kubenswrapper[4791]: I0217 00:28:56.645064 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" event={"ID":"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd","Type":"ContainerStarted","Data":"ba9c3932a458788ed268f59e278169101bc6fe2285f0cd5930490be0912c7ef9"} Feb 17 00:28:56 crc kubenswrapper[4791]: I0217 00:28:56.668624 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" podStartSLOduration=2.135212671 podStartE2EDuration="10.66860632s" podCreationTimestamp="2026-02-17 00:28:46 +0000 UTC" firstStartedPulling="2026-02-17 00:28:47.09007319 +0000 UTC m=+1384.569585727" lastFinishedPulling="2026-02-17 00:28:55.623466849 +0000 UTC m=+1393.102979376" observedRunningTime="2026-02-17 00:28:56.665496483 +0000 UTC m=+1394.145009020" watchObservedRunningTime="2026-02-17 00:28:56.66860632 +0000 UTC m=+1394.148118847" Feb 17 00:28:58 crc kubenswrapper[4791]: I0217 00:28:58.658531 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"998ef5a14e6e2a31346916c64f98b39fb611f2c22ab183eb500881c5f701baa9"} Feb 17 00:29:00 crc kubenswrapper[4791]: I0217 00:29:00.677582 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"3753dc97d2d9a9f9c377ac877c2152f4b3ae9f33fdfea70a5940be95ade659c6"} Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.599636 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8"] Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.602572 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.607701 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.607718 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.607767 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.616405 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-dwsz6" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.626765 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8"] Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.690954 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d7ac416-17e0-4f86-8786-0afdec7fc240-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691073 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/0d7ac416-17e0-4f86-8786-0afdec7fc240-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691152 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691388 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ksqx\" (UniqueName: \"kubernetes.io/projected/0d7ac416-17e0-4f86-8786-0afdec7fc240-kube-api-access-6ksqx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691424 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.715851 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"e4daa4b3e8564fa8acd4bcf94b8ae100cc65c2fd1c2aaded03771f6a80791e1b"} Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.724242 4791 generic.go:334] "Generic (PLEG): container finished" podID="df247d19-621c-4c9b-a436-d4f263dcb5ae" containerID="998ef5a14e6e2a31346916c64f98b39fb611f2c22ab183eb500881c5f701baa9" exitCode=0 Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.724308 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerDied","Data":"998ef5a14e6e2a31346916c64f98b39fb611f2c22ab183eb500881c5f701baa9"} Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793300 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ksqx\" (UniqueName: \"kubernetes.io/projected/0d7ac416-17e0-4f86-8786-0afdec7fc240-kube-api-access-6ksqx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793370 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793433 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d7ac416-17e0-4f86-8786-0afdec7fc240-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793460 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/0d7ac416-17e0-4f86-8786-0afdec7fc240-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793531 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: E0217 00:29:03.794784 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:03 crc kubenswrapper[4791]: E0217 00:29:03.794851 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls podName:0d7ac416-17e0-4f86-8786-0afdec7fc240 nodeName:}" failed. No retries permitted until 2026-02-17 00:29:04.294828872 +0000 UTC m=+1401.774341399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" (UID: "0d7ac416-17e0-4f86-8786-0afdec7fc240") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.795735 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d7ac416-17e0-4f86-8786-0afdec7fc240-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.796411 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/0d7ac416-17e0-4f86-8786-0afdec7fc240-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.820452 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.821672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ksqx\" (UniqueName: \"kubernetes.io/projected/0d7ac416-17e0-4f86-8786-0afdec7fc240-kube-api-access-6ksqx\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:04 crc kubenswrapper[4791]: I0217 00:29:04.300078 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:04 crc kubenswrapper[4791]: E0217 00:29:04.300254 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:04 crc kubenswrapper[4791]: E0217 00:29:04.300346 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls podName:0d7ac416-17e0-4f86-8786-0afdec7fc240 nodeName:}" failed. No retries permitted until 2026-02-17 00:29:05.300324157 +0000 UTC m=+1402.779836684 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" (UID: "0d7ac416-17e0-4f86-8786-0afdec7fc240") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:05 crc kubenswrapper[4791]: I0217 00:29:05.316063 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:05 crc kubenswrapper[4791]: I0217 00:29:05.329233 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:05 crc kubenswrapper[4791]: I0217 00:29:05.423279 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.232925 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9"] Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.234231 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.236956 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.236954 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.245209 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9"] Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430648 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430764 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430811 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-socket-dir\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430843 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865kq\" (UniqueName: \"kubernetes.io/projected/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-kube-api-access-865kq\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430953 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532711 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532773 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") 
" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532797 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532819 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-865kq\" (UniqueName: \"kubernetes.io/projected/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-kube-api-access-865kq\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532850 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: E0217 00:29:06.532886 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:06 crc kubenswrapper[4791]: E0217 00:29:06.532957 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls podName:e3d9a725-2f9c-4fcc-8610-4b297a3d689d nodeName:}" failed. 
No retries permitted until 2026-02-17 00:29:07.032940021 +0000 UTC m=+1404.512452538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" (UID: "e3d9a725-2f9c-4fcc-8610-4b297a3d689d") : secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.533902 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.535057 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.537248 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.561571 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-865kq\" (UniqueName: 
\"kubernetes.io/projected/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-kube-api-access-865kq\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:07 crc kubenswrapper[4791]: I0217 00:29:07.039182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:07 crc kubenswrapper[4791]: E0217 00:29:07.039289 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:07 crc kubenswrapper[4791]: E0217 00:29:07.039392 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls podName:e3d9a725-2f9c-4fcc-8610-4b297a3d689d nodeName:}" failed. No retries permitted until 2026-02-17 00:29:08.039373945 +0000 UTC m=+1405.518886472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" (UID: "e3d9a725-2f9c-4fcc-8610-4b297a3d689d") : secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:08 crc kubenswrapper[4791]: I0217 00:29:08.050679 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:08 crc kubenswrapper[4791]: I0217 00:29:08.059023 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:08 crc kubenswrapper[4791]: I0217 00:29:08.356161 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.443348 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8"] Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.946289 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p"] Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.949223 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.951323 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.951853 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.965271 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p"] Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084577 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084631 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084681 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084708 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084730 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2wc\" (UniqueName: \"kubernetes.io/projected/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-kube-api-access-nh2wc\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186309 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-socket-dir\") pod 
\"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186360 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186395 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2wc\" (UniqueName: \"kubernetes.io/projected/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-kube-api-access-nh2wc\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186477 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186506 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.186562 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.186932 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls podName:6e985bca-5f71-47e0-bd63-ede2ad79bd7e nodeName:}" failed. No retries permitted until 2026-02-17 00:29:10.686912044 +0000 UTC m=+1408.166424561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" (UID: "6e985bca-5f71-47e0-bd63-ede2ad79bd7e") : secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.187896 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.188249 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.193876 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.222872 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2wc\" (UniqueName: \"kubernetes.io/projected/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-kube-api-access-nh2wc\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.694366 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.694577 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.694878 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls podName:6e985bca-5f71-47e0-bd63-ede2ad79bd7e nodeName:}" failed. No retries permitted until 2026-02-17 00:29:11.694859906 +0000 UTC m=+1409.174372433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" (UID: "6e985bca-5f71-47e0-bd63-ede2ad79bd7e") : secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.279732 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9"] Feb 17 00:29:11 crc kubenswrapper[4791]: W0217 00:29:11.293292 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3d9a725_2f9c_4fcc_8610_4b297a3d689d.slice/crio-607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5 WatchSource:0}: Error finding container 607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5: Status 404 returned error can't find the container with id 607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5 Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.714168 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.724828 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.765672 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.786225 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"37c810ec5bee67125f7768518146935329d76039df0ea1422f1a62c24a4c0161"} Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.787226 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"2ea0b480a8247883d24e5aded33de0fdcbf8948ea1f81f572b1d93ba31c2ef4b"} Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.789505 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"84e468c8b653dc5b490fb7c9a38b4cd72c4b7f05c8279923819b2ccb0540039c"} Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.790956 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5"} Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.242196 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.127527702 podStartE2EDuration="37.242176088s" podCreationTimestamp="2026-02-17 00:28:35 +0000 UTC" firstStartedPulling="2026-02-17 00:28:38.899955134 +0000 UTC 
m=+1376.379467661" lastFinishedPulling="2026-02-17 00:29:11.01460352 +0000 UTC m=+1408.494116047" observedRunningTime="2026-02-17 00:29:11.824917255 +0000 UTC m=+1409.304429782" watchObservedRunningTime="2026-02-17 00:29:12.242176088 +0000 UTC m=+1409.721688635" Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.245867 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p"] Feb 17 00:29:12 crc kubenswrapper[4791]: W0217 00:29:12.277386 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e985bca_5f71_47e0_bd63_ede2ad79bd7e.slice/crio-faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be WatchSource:0}: Error finding container faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be: Status 404 returned error can't find the container with id faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.797976 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be"} Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.800123 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"0bcd8fd838c425b802a4819261a744dba4ce449654ec0a0fe6ae473c08cdeba1"} Feb 17 00:29:13 crc kubenswrapper[4791]: I0217 00:29:13.439991 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:13 crc kubenswrapper[4791]: I0217 00:29:13.807596 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"3d3e6e5f6fae0a1d9bc95cd8d0ec2bdd51919a431ec36da1b9ad69643dd5bb68"} Feb 17 00:29:13 crc kubenswrapper[4791]: I0217 00:29:13.810258 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"153265f174b4a9d401be05ef4b3d5074ecb8cffeda6cb6fb5a3745907cbc94f9"} Feb 17 00:29:16 crc kubenswrapper[4791]: I0217 00:29:16.833342 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"06984239a9a3196958fafabce1efbb7339596d15d505f88825db83923ee3bd54"} Feb 17 00:29:16 crc kubenswrapper[4791]: I0217 00:29:16.836049 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"b1e6bba6a1270fae1a1dedb13487801a2467c60de6bdb21dae1122ce5b716e1d"} Feb 17 00:29:16 crc kubenswrapper[4791]: I0217 00:29:16.859349 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.407657542 podStartE2EDuration="27.859333717s" podCreationTimestamp="2026-02-17 00:28:49 +0000 UTC" firstStartedPulling="2026-02-17 00:29:03.730302547 +0000 UTC m=+1401.209815074" lastFinishedPulling="2026-02-17 00:29:16.181978722 +0000 UTC m=+1413.661491249" observedRunningTime="2026-02-17 00:29:16.857095787 +0000 UTC m=+1414.336608314" watchObservedRunningTime="2026-02-17 00:29:16.859333717 +0000 UTC m=+1414.338846244" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.564608 4791 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk"] Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.565690 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.569260 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.569392 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.584164 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk"] Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701042 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4634db00-8e3c-4569-b66c-ef549eda9204-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701105 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rs6n\" (UniqueName: \"kubernetes.io/projected/4634db00-8e3c-4569-b66c-ef549eda9204-kube-api-access-6rs6n\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701462 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elastic-certs\" (UniqueName: \"kubernetes.io/secret/4634db00-8e3c-4569-b66c-ef549eda9204-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701632 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4634db00-8e3c-4569-b66c-ef549eda9204-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.803506 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4634db00-8e3c-4569-b66c-ef549eda9204-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.803903 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4634db00-8e3c-4569-b66c-ef549eda9204-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.803938 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rs6n\" (UniqueName: \"kubernetes.io/projected/4634db00-8e3c-4569-b66c-ef549eda9204-kube-api-access-6rs6n\") pod 
\"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.804031 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/4634db00-8e3c-4569-b66c-ef549eda9204-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.804600 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4634db00-8e3c-4569-b66c-ef549eda9204-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.804684 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4634db00-8e3c-4569-b66c-ef549eda9204-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.826046 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/4634db00-8e3c-4569-b66c-ef549eda9204-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 
00:29:17.826773 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rs6n\" (UniqueName: \"kubernetes.io/projected/4634db00-8e3c-4569-b66c-ef549eda9204-kube-api-access-6rs6n\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.844054 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5"} Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.847210 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c"} Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.887135 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.372973 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk"] Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.503853 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m"] Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.505039 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.507537 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.521146 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m"] Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.616841 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/21f15fa5-1f89-4aae-b6df-6a7c33630f43-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.616903 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/21f15fa5-1f89-4aae-b6df-6a7c33630f43-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.616920 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/21f15fa5-1f89-4aae-b6df-6a7c33630f43-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.617028 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppcx\" (UniqueName: \"kubernetes.io/projected/21f15fa5-1f89-4aae-b6df-6a7c33630f43-kube-api-access-nppcx\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721709 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppcx\" (UniqueName: \"kubernetes.io/projected/21f15fa5-1f89-4aae-b6df-6a7c33630f43-kube-api-access-nppcx\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721842 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/21f15fa5-1f89-4aae-b6df-6a7c33630f43-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721888 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/21f15fa5-1f89-4aae-b6df-6a7c33630f43-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721940 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/21f15fa5-1f89-4aae-b6df-6a7c33630f43-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.723783 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/21f15fa5-1f89-4aae-b6df-6a7c33630f43-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.724298 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/21f15fa5-1f89-4aae-b6df-6a7c33630f43-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.728809 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/21f15fa5-1f89-4aae-b6df-6a7c33630f43-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.741687 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppcx\" (UniqueName: \"kubernetes.io/projected/21f15fa5-1f89-4aae-b6df-6a7c33630f43-kube-api-access-nppcx\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.828721 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.856445 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377"} Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.856482 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"8179bd87591c1f9881dc2de0c592243d6bc718cdd78eb7e8596a2b1a06322d81"} Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.861253 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13"} Feb 17 00:29:19 crc kubenswrapper[4791]: I0217 00:29:19.457639 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m"] Feb 17 00:29:19 crc kubenswrapper[4791]: W0217 00:29:19.472062 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21f15fa5_1f89_4aae_b6df_6a7c33630f43.slice/crio-b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd WatchSource:0}: Error finding container b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd: Status 404 returned error can't 
find the container with id b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd Feb 17 00:29:19 crc kubenswrapper[4791]: I0217 00:29:19.874621 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c"} Feb 17 00:29:19 crc kubenswrapper[4791]: I0217 00:29:19.874664 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd"} Feb 17 00:29:23 crc kubenswrapper[4791]: I0217 00:29:23.439566 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:23 crc kubenswrapper[4791]: I0217 00:29:23.479974 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:23 crc kubenswrapper[4791]: I0217 00:29:23.965979 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:24 crc kubenswrapper[4791]: I0217 00:29:24.972857 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:29:24 crc kubenswrapper[4791]: I0217 00:29:24.973178 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:29:26 crc kubenswrapper[4791]: I0217 00:29:26.935853 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"388387806d1ad8d2a29e6701dc33e7f13547353ccae0456fc7d97053e594880f"} Feb 17 00:29:26 crc kubenswrapper[4791]: I0217 00:29:26.938087 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"c9c7d14efbce8d1901ca1ad40d55496b577d720f8fb2bb0d33040a97de15d3e1"} Feb 17 00:29:26 crc kubenswrapper[4791]: I0217 00:29:26.959375 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" podStartSLOduration=8.12337634 podStartE2EDuration="23.959313358s" podCreationTimestamp="2026-02-17 00:29:03 +0000 UTC" firstStartedPulling="2026-02-17 00:29:10.807731143 +0000 UTC m=+1408.287243670" lastFinishedPulling="2026-02-17 00:29:26.643668121 +0000 UTC m=+1424.123180688" observedRunningTime="2026-02-17 00:29:26.953387823 +0000 UTC m=+1424.432900350" watchObservedRunningTime="2026-02-17 00:29:26.959313358 +0000 UTC m=+1424.438825885" Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.947856 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"05dc36f926de66665bc35dd51670031593be97eb4ceb070b46accbe83c7ef398"} Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.949202 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" 
event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"10d44b843e268319ad274fcb5d47ba27034d14c8f5811714251ce04d0a229fe1"} Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.954931 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"908cd5621ddb25167215f9bd3d62ca48f923f099199c4bf64035896d11a3df19"} Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.982490 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" podStartSLOduration=5.138138323 podStartE2EDuration="18.982449715s" podCreationTimestamp="2026-02-17 00:29:09 +0000 UTC" firstStartedPulling="2026-02-17 00:29:12.842529151 +0000 UTC m=+1410.322041668" lastFinishedPulling="2026-02-17 00:29:26.686840493 +0000 UTC m=+1424.166353060" observedRunningTime="2026-02-17 00:29:26.97388673 +0000 UTC m=+1424.453399257" watchObservedRunningTime="2026-02-17 00:29:27.982449715 +0000 UTC m=+1425.461962342" Feb 17 00:29:28 crc kubenswrapper[4791]: I0217 00:29:28.017156 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" podStartSLOduration=6.476607603 podStartE2EDuration="22.017092762s" podCreationTimestamp="2026-02-17 00:29:06 +0000 UTC" firstStartedPulling="2026-02-17 00:29:11.296341393 +0000 UTC m=+1408.775853920" lastFinishedPulling="2026-02-17 00:29:26.836826552 +0000 UTC m=+1424.316339079" observedRunningTime="2026-02-17 00:29:28.002808708 +0000 UTC m=+1425.482321275" watchObservedRunningTime="2026-02-17 00:29:28.017092762 +0000 UTC m=+1425.496605329" Feb 17 00:29:28 crc kubenswrapper[4791]: I0217 00:29:28.017374 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" podStartSLOduration=2.526549511 podStartE2EDuration="11.017363539s" podCreationTimestamp="2026-02-17 00:29:17 +0000 UTC" firstStartedPulling="2026-02-17 00:29:18.406273487 +0000 UTC m=+1415.885786014" lastFinishedPulling="2026-02-17 00:29:26.897087515 +0000 UTC m=+1424.376600042" observedRunningTime="2026-02-17 00:29:27.982795806 +0000 UTC m=+1425.462308373" watchObservedRunningTime="2026-02-17 00:29:28.017363539 +0000 UTC m=+1425.496876116" Feb 17 00:29:28 crc kubenswrapper[4791]: I0217 00:29:28.047259 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" podStartSLOduration=2.545586632 podStartE2EDuration="10.047231398s" podCreationTimestamp="2026-02-17 00:29:18 +0000 UTC" firstStartedPulling="2026-02-17 00:29:19.476164157 +0000 UTC m=+1416.955676684" lastFinishedPulling="2026-02-17 00:29:26.977808923 +0000 UTC m=+1424.457321450" observedRunningTime="2026-02-17 00:29:28.028730173 +0000 UTC m=+1425.508242720" watchObservedRunningTime="2026-02-17 00:29:28.047231398 +0000 UTC m=+1425.526744085" Feb 17 00:29:30 crc kubenswrapper[4791]: I0217 00:29:30.733148 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:29:30 crc kubenswrapper[4791]: I0217 00:29:30.733843 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" containerID="cri-o://5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba" gracePeriod=30 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.984021 4791 generic.go:334] "Generic (PLEG): container finished" podID="0d7ac416-17e0-4f86-8786-0afdec7fc240" containerID="74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5" 
exitCode=0 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.984132 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerDied","Data":"74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5"} Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.984976 4791 scope.go:117] "RemoveContainer" containerID="74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5" Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.990707 4791 generic.go:334] "Generic (PLEG): container finished" podID="21f15fa5-1f89-4aae-b6df-6a7c33630f43" containerID="bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c" exitCode=0 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.990817 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerDied","Data":"bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c"} Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.991440 4791 scope.go:117] "RemoveContainer" containerID="bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c" Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.996947 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e985bca-5f71-47e0-bd63-ede2ad79bd7e" containerID="18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13" exitCode=0 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.997021 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerDied","Data":"18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13"} Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.997573 4791 
scope.go:117] "RemoveContainer" containerID="18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13" Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.005056 4791 generic.go:334] "Generic (PLEG): container finished" podID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerID="5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba" exitCode=0 Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.005165 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerDied","Data":"5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba"} Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.006859 4791 generic.go:334] "Generic (PLEG): container finished" podID="e3d9a725-2f9c-4fcc-8610-4b297a3d689d" containerID="ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c" exitCode=0 Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.006897 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerDied","Data":"ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c"} Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.007410 4791 scope.go:117] "RemoveContainer" containerID="ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c" Feb 17 00:29:33 crc kubenswrapper[4791]: I0217 00:29:33.013608 4791 generic.go:334] "Generic (PLEG): container finished" podID="4634db00-8e3c-4569-b66c-ef549eda9204" containerID="bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377" exitCode=0 Feb 17 00:29:33 crc kubenswrapper[4791]: I0217 00:29:33.013960 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" 
event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerDied","Data":"bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377"} Feb 17 00:29:33 crc kubenswrapper[4791]: I0217 00:29:33.014477 4791 scope.go:117] "RemoveContainer" containerID="bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.716954 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.769974 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-sjw5t"] Feb 17 00:29:34 crc kubenswrapper[4791]: E0217 00:29:34.770366 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.770385 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.770517 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.771161 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.779690 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-sjw5t"] Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.883974 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884043 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884119 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884181 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884222 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884253 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884315 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884483 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884516 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884552 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-config\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884604 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhgv\" (UniqueName: \"kubernetes.io/projected/9caafe24-6ee4-425b-b175-c0901dab223f-kube-api-access-vjhgv\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884665 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-users\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884716 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884747 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884840 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.889295 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.889777 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.890051 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.890694 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.891333 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.894223 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8" (OuterVolumeSpecName: "kube-api-access-qt7d8") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "kube-api-access-qt7d8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986230 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-users\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986295 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986877 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986940 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986964 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987026 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-config\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987081 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhgv\" (UniqueName: \"kubernetes.io/projected/9caafe24-6ee4-425b-b175-c0901dab223f-kube-api-access-vjhgv\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987163 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987184 4791 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987197 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987209 4791 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987221 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987235 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987249 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.988263 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-config\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.993024 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: 
\"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.996515 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-users\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.997683 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.997892 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.997917 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.005259 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vjhgv\" (UniqueName: \"kubernetes.io/projected/9caafe24-6ee4-425b-b175-c0901dab223f-kube-api-access-vjhgv\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.027325 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerDied","Data":"b23f71e1251fc3bfc8c2584134054c100e44edda849e4b8dcd9907f81f1b91b4"} Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.027380 4791 scope.go:117] "RemoveContainer" containerID="5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.027391 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.063939 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.071284 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.095405 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.240388 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" path="/var/lib/kubelet/pods/99be99ec-fe95-419c-ba3f-b4e3601e433a/volumes" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.318467 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-sjw5t"] Feb 17 00:29:35 crc kubenswrapper[4791]: W0217 00:29:35.327461 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9caafe24_6ee4_425b_b175_c0901dab223f.slice/crio-3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc WatchSource:0}: Error finding container 3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc: Status 404 returned error can't find the container with id 3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc Feb 17 00:29:36 crc kubenswrapper[4791]: I0217 00:29:36.034323 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" event={"ID":"9caafe24-6ee4-425b-b175-c0901dab223f","Type":"ContainerStarted","Data":"a9257b2f6143e5fdbef6d9970265ea2019fa470cadca2aa64b11d9eed6b20c4f"} Feb 17 00:29:36 crc kubenswrapper[4791]: I0217 00:29:36.034365 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" event={"ID":"9caafe24-6ee4-425b-b175-c0901dab223f","Type":"ContainerStarted","Data":"3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc"} Feb 17 00:29:36 crc kubenswrapper[4791]: I0217 00:29:36.061129 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" podStartSLOduration=6.061090665 podStartE2EDuration="6.061090665s" 
podCreationTimestamp="2026-02-17 00:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:29:36.056243804 +0000 UTC m=+1433.535756321" watchObservedRunningTime="2026-02-17 00:29:36.061090665 +0000 UTC m=+1433.540603212" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.042965 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.046191 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.048835 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.051869 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.056023 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" 
event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.883296 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.884388 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.886530 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.886983 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.894363 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.031610 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/43948ca6-1e04-4a7f-867d-d5f6d69d240d-qdr-test-config\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.031655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72ht\" (UniqueName: \"kubernetes.io/projected/43948ca6-1e04-4a7f-867d-d5f6d69d240d-kube-api-access-v72ht\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.031685 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/43948ca6-1e04-4a7f-867d-d5f6d69d240d-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.064694 4791 generic.go:334] "Generic (PLEG): container finished" podID="4634db00-8e3c-4569-b66c-ef549eda9204" containerID="ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.064776 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerDied","Data":"ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.064836 4791 scope.go:117] "RemoveContainer" containerID="bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.065444 4791 scope.go:117] "RemoveContainer" containerID="ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.065742 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk_service-telemetry(4634db00-8e3c-4569-b66c-ef549eda9204)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" podUID="4634db00-8e3c-4569-b66c-ef549eda9204" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.067274 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e985bca-5f71-47e0-bd63-ede2ad79bd7e" containerID="440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.067349 4791 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerDied","Data":"440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.067726 4791 scope.go:117] "RemoveContainer" containerID="440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.067929 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p_service-telemetry(6e985bca-5f71-47e0-bd63-ede2ad79bd7e)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" podUID="6e985bca-5f71-47e0-bd63-ede2ad79bd7e" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.069723 4791 generic.go:334] "Generic (PLEG): container finished" podID="e3d9a725-2f9c-4fcc-8610-4b297a3d689d" containerID="df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.069781 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerDied","Data":"df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.070331 4791 scope.go:117] "RemoveContainer" containerID="df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.070621 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge 
pod=default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9_service-telemetry(e3d9a725-2f9c-4fcc-8610-4b297a3d689d)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" podUID="e3d9a725-2f9c-4fcc-8610-4b297a3d689d" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.072731 4791 generic.go:334] "Generic (PLEG): container finished" podID="0d7ac416-17e0-4f86-8786-0afdec7fc240" containerID="750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.072777 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerDied","Data":"750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.073098 4791 scope.go:117] "RemoveContainer" containerID="750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.073284 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8_service-telemetry(0d7ac416-17e0-4f86-8786-0afdec7fc240)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" podUID="0d7ac416-17e0-4f86-8786-0afdec7fc240" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.083031 4791 generic.go:334] "Generic (PLEG): container finished" podID="21f15fa5-1f89-4aae-b6df-6a7c33630f43" containerID="698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.083076 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" 
event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerDied","Data":"698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.083718 4791 scope.go:117] "RemoveContainer" containerID="698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.083926 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m_service-telemetry(21f15fa5-1f89-4aae-b6df-6a7c33630f43)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" podUID="21f15fa5-1f89-4aae-b6df-6a7c33630f43" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.107315 4791 scope.go:117] "RemoveContainer" containerID="18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.132876 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/43948ca6-1e04-4a7f-867d-d5f6d69d240d-qdr-test-config\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.132936 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72ht\" (UniqueName: \"kubernetes.io/projected/43948ca6-1e04-4a7f-867d-d5f6d69d240d-kube-api-access-v72ht\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.132983 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/43948ca6-1e04-4a7f-867d-d5f6d69d240d-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.133662 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/43948ca6-1e04-4a7f-867d-d5f6d69d240d-qdr-test-config\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.143346 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/43948ca6-1e04-4a7f-867d-d5f6d69d240d-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.143513 4791 scope.go:117] "RemoveContainer" containerID="ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.165817 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72ht\" (UniqueName: \"kubernetes.io/projected/43948ca6-1e04-4a7f-867d-d5f6d69d240d-kube-api-access-v72ht\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.207927 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.240392 4791 scope.go:117] "RemoveContainer" containerID="74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.274423 4791 scope.go:117] "RemoveContainer" containerID="bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.642541 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 17 00:29:38 crc kubenswrapper[4791]: W0217 00:29:38.646234 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43948ca6_1e04_4a7f_867d_d5f6d69d240d.slice/crio-20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39 WatchSource:0}: Error finding container 20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39: Status 404 returned error can't find the container with id 20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39 Feb 17 00:29:39 crc kubenswrapper[4791]: I0217 00:29:39.096173 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"43948ca6-1e04-4a7f-867d-d5f6d69d240d","Type":"ContainerStarted","Data":"20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39"} Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.162177 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"43948ca6-1e04-4a7f-867d-d5f6d69d240d","Type":"ContainerStarted","Data":"a33231ba3a43a59d314c17bc1db0dffe12835a7ebb8205582c8fe6f3e5425e5d"} Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.178870 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.51780443 podStartE2EDuration="10.178855847s" podCreationTimestamp="2026-02-17 00:29:37 +0000 
UTC" firstStartedPulling="2026-02-17 00:29:38.648691498 +0000 UTC m=+1436.128204025" lastFinishedPulling="2026-02-17 00:29:46.309742915 +0000 UTC m=+1443.789255442" observedRunningTime="2026-02-17 00:29:47.175977238 +0000 UTC m=+1444.655489765" watchObservedRunningTime="2026-02-17 00:29:47.178855847 +0000 UTC m=+1444.658368374" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.525735 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-k8frg"] Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.527491 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.530195 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.532251 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.532479 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.534622 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.536530 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-k8frg"] Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.536712 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.537475 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" 
Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593846 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593888 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593914 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593938 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593957 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod 
\"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.594141 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.594163 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695059 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695177 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695331 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: 
\"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695369 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695419 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695456 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695495 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.696006 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.696738 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.696796 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.697095 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.697093 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.697508 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.737494 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.847435 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.011650 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.023432 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.023548 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.099909 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-k8frg"] Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.100533 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"curl\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.172043 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerStarted","Data":"fb78639120c2ceb67338064e7f2d7aba445e0c94d0f5aa427da45e085399f0fb"} Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.201610 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"curl\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.225790 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"curl\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.353639 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.782353 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.178935 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094","Type":"ContainerStarted","Data":"690037d92f4fa5c73c59322e6169368ac5590df450d4325ba298a0d94081d46b"} Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.220468 4791 scope.go:117] "RemoveContainer" containerID="440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c" Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.220523 4791 scope.go:117] "RemoveContainer" containerID="750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6" Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.220588 4791 scope.go:117] "RemoveContainer" containerID="ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681" Feb 17 00:29:50 crc kubenswrapper[4791]: I0217 00:29:50.200783 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"02ff514e88ff70cec9d63b954878a0505cc7c820d435f196394093abc4284b2a"} Feb 17 00:29:50 crc kubenswrapper[4791]: I0217 00:29:50.204889 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"97ce3fc7429c7ac09971828969b2933496e787c2cc86f2e4d6b34ea7a9bf3cd4"} Feb 17 00:29:50 crc kubenswrapper[4791]: I0217 00:29:50.207890 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" 
event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"dcaf66b043a2c6b72c0b3c5c46769a60605df0b5eb5a14becae05bced65affbb"} Feb 17 00:29:51 crc kubenswrapper[4791]: I0217 00:29:51.214461 4791 generic.go:334] "Generic (PLEG): container finished" podID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerID="138acf522734f6f66553cdc60197343cc0def84117ff762dbfd2617d285bfd2d" exitCode=0 Feb 17 00:29:51 crc kubenswrapper[4791]: I0217 00:29:51.214499 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094","Type":"ContainerDied","Data":"138acf522734f6f66553cdc60197343cc0def84117ff762dbfd2617d285bfd2d"} Feb 17 00:29:51 crc kubenswrapper[4791]: I0217 00:29:51.220643 4791 scope.go:117] "RemoveContainer" containerID="698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6" Feb 17 00:29:53 crc kubenswrapper[4791]: I0217 00:29:53.227891 4791 scope.go:117] "RemoveContainer" containerID="df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.417123 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.507229 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.516539 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9" (OuterVolumeSpecName: "kube-api-access-zqsb9") pod "5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" (UID: "5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094"). 
InnerVolumeSpecName "kube-api-access-zqsb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.550338 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094/curl/0.log" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.609606 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.804215 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-z8m2z_135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd/prometheus-webhook-snmp/0.log" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.972656 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.972717 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:29:55 crc kubenswrapper[4791]: I0217 00:29:55.245229 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094","Type":"ContainerDied","Data":"690037d92f4fa5c73c59322e6169368ac5590df450d4325ba298a0d94081d46b"} Feb 17 00:29:55 crc kubenswrapper[4791]: I0217 00:29:55.245274 4791 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="690037d92f4fa5c73c59322e6169368ac5590df450d4325ba298a0d94081d46b" Feb 17 00:29:55 crc kubenswrapper[4791]: I0217 00:29:55.245337 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:59 crc kubenswrapper[4791]: I0217 00:29:59.271335 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerStarted","Data":"92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed"} Feb 17 00:29:59 crc kubenswrapper[4791]: I0217 00:29:59.274641 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"1cefd2a346da979f0d4271d76cfe3cfaaaa0b7b2c62557a9196303556d822ac9"} Feb 17 00:29:59 crc kubenswrapper[4791]: I0217 00:29:59.276604 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"0913487f3336339a89c2c386a8b3f9c8ee342543562c2a1621a25e49b0185945"} Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.142679 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9"] Feb 17 00:30:00 crc kubenswrapper[4791]: E0217 00:30:00.143326 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerName="curl" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.143348 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerName="curl" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.143457 4791 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerName="curl" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.143900 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.145949 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.146962 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.151680 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9"] Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.196861 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.196954 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.196992 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.298702 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.298828 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.298888 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.299764 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.312000 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.329051 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.460599 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.925842 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9"] Feb 17 00:30:08 crc kubenswrapper[4791]: I0217 00:30:08.363673 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" event={"ID":"8399bca3-cb20-4884-9b21-8ec3dae4c326","Type":"ContainerStarted","Data":"8b3cc8284e64a2f5812158dddd5a1b23b6c0e660df4f9cc40fa53f34e218ece8"} Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.373428 4791 generic.go:334] "Generic (PLEG): container finished" podID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerID="6dc8ea3daa418770ab1f163521fedb78769049fea4d06277ba9a21c58b357e21" exitCode=0 Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.374220 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" 
event={"ID":"8399bca3-cb20-4884-9b21-8ec3dae4c326","Type":"ContainerDied","Data":"6dc8ea3daa418770ab1f163521fedb78769049fea4d06277ba9a21c58b357e21"} Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.376760 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerStarted","Data":"d13f5fd2822d8c39c1a9232da45b4fcc7917aa08355fdc5dc30d6280c800f247"} Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.429505 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-k8frg" podStartSLOduration=2.304329313 podStartE2EDuration="22.429486391s" podCreationTimestamp="2026-02-17 00:29:47 +0000 UTC" firstStartedPulling="2026-02-17 00:29:48.110687048 +0000 UTC m=+1445.590199575" lastFinishedPulling="2026-02-17 00:30:08.235844126 +0000 UTC m=+1465.715356653" observedRunningTime="2026-02-17 00:30:09.418524451 +0000 UTC m=+1466.898036978" watchObservedRunningTime="2026-02-17 00:30:09.429486391 +0000 UTC m=+1466.908998908" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.687345 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.768190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"8399bca3-cb20-4884-9b21-8ec3dae4c326\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.768297 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"8399bca3-cb20-4884-9b21-8ec3dae4c326\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.768424 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"8399bca3-cb20-4884-9b21-8ec3dae4c326\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.769145 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume" (OuterVolumeSpecName: "config-volume") pod "8399bca3-cb20-4884-9b21-8ec3dae4c326" (UID: "8399bca3-cb20-4884-9b21-8ec3dae4c326"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.773371 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh" (OuterVolumeSpecName: "kube-api-access-fjpgh") pod "8399bca3-cb20-4884-9b21-8ec3dae4c326" (UID: "8399bca3-cb20-4884-9b21-8ec3dae4c326"). 
InnerVolumeSpecName "kube-api-access-fjpgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.776235 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8399bca3-cb20-4884-9b21-8ec3dae4c326" (UID: "8399bca3-cb20-4884-9b21-8ec3dae4c326"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.870259 4791 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.870320 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.870340 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:11 crc kubenswrapper[4791]: I0217 00:30:11.396840 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" event={"ID":"8399bca3-cb20-4884-9b21-8ec3dae4c326","Type":"ContainerDied","Data":"8b3cc8284e64a2f5812158dddd5a1b23b6c0e660df4f9cc40fa53f34e218ece8"} Feb 17 00:30:11 crc kubenswrapper[4791]: I0217 00:30:11.396882 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3cc8284e64a2f5812158dddd5a1b23b6c0e660df4f9cc40fa53f34e218ece8" Feb 17 00:30:11 crc kubenswrapper[4791]: I0217 00:30:11.396932 4791 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.973632 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.974534 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.974588 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.975324 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.975394 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57" gracePeriod=600 Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.984856 4791 log.go:25] "Finished 
parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-z8m2z_135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd/prometheus-webhook-snmp/0.log" Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515703 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57" exitCode=0 Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515841 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"} Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515868 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"} Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515883 4791 scope.go:117] "RemoveContainer" containerID="71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1" Feb 17 00:30:32 crc kubenswrapper[4791]: I0217 00:30:32.575235 4791 generic.go:334] "Generic (PLEG): container finished" podID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerID="92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed" exitCode=0 Feb 17 00:30:32 crc kubenswrapper[4791]: I0217 00:30:32.575342 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerDied","Data":"92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed"} Feb 17 00:30:32 crc kubenswrapper[4791]: I0217 00:30:32.576262 4791 scope.go:117] "RemoveContainer" 
containerID="92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed" Feb 17 00:30:40 crc kubenswrapper[4791]: I0217 00:30:40.649994 4791 generic.go:334] "Generic (PLEG): container finished" podID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerID="d13f5fd2822d8c39c1a9232da45b4fcc7917aa08355fdc5dc30d6280c800f247" exitCode=0 Feb 17 00:30:40 crc kubenswrapper[4791]: I0217 00:30:40.650541 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerDied","Data":"d13f5fd2822d8c39c1a9232da45b4fcc7917aa08355fdc5dc30d6280c800f247"} Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.919044 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969463 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969526 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969559 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 
00:30:41.969617 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") "
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969648 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") "
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969707 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") "
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969764 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") "
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.975462 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p" (OuterVolumeSpecName: "kube-api-access-lsx9p") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "kube-api-access-lsx9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.985816 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.986683 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.988703 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.990181 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.992974 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.999267 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.071958 4791 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072012 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072030 4791 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072042 4791 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072054 4791 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072066 4791 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072078 4791 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") on node \"crc\" DevicePath \"\""
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.672736 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerDied","Data":"fb78639120c2ceb67338064e7f2d7aba445e0c94d0f5aa427da45e085399f0fb"}
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.672995 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb78639120c2ceb67338064e7f2d7aba445e0c94d0f5aa427da45e085399f0fb"
Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.672875 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg"
Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.078016 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-k8frg_e8590bb8-aabf-4bf2-b28d-f1c7b0872780/smoketest-collectd/0.log"
Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.353030 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-k8frg_e8590bb8-aabf-4bf2-b28d-f1c7b0872780/smoketest-ceilometer/0.log"
Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.656941 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-sjw5t_9caafe24-6ee4-425b-b175-c0901dab223f/default-interconnect/0.log"
Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.919285 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8_0d7ac416-17e0-4f86-8786-0afdec7fc240/bridge/2.log"
Feb 17 00:30:45 crc kubenswrapper[4791]: I0217 00:30:45.204056 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8_0d7ac416-17e0-4f86-8786-0afdec7fc240/sg-core/0.log"
Feb 17 00:30:45 crc kubenswrapper[4791]: I0217 00:30:45.545555 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk_4634db00-8e3c-4569-b66c-ef549eda9204/bridge/2.log"
Feb 17 00:30:45 crc kubenswrapper[4791]: I0217 00:30:45.857303 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk_4634db00-8e3c-4569-b66c-ef549eda9204/sg-core/0.log"
Feb 17 00:30:46 crc kubenswrapper[4791]: I0217 00:30:46.170564 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9_e3d9a725-2f9c-4fcc-8610-4b297a3d689d/bridge/2.log"
Feb 17 00:30:46 crc kubenswrapper[4791]: I0217 00:30:46.466278 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9_e3d9a725-2f9c-4fcc-8610-4b297a3d689d/sg-core/0.log"
Feb 17 00:30:46 crc kubenswrapper[4791]: I0217 00:30:46.790200 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m_21f15fa5-1f89-4aae-b6df-6a7c33630f43/bridge/2.log"
Feb 17 00:30:47 crc kubenswrapper[4791]: I0217 00:30:47.041482 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m_21f15fa5-1f89-4aae-b6df-6a7c33630f43/sg-core/0.log"
Feb 17 00:30:47 crc kubenswrapper[4791]: I0217 00:30:47.294913 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p_6e985bca-5f71-47e0-bd63-ede2ad79bd7e/bridge/2.log"
Feb 17 00:30:47 crc kubenswrapper[4791]: I0217 00:30:47.571447 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p_6e985bca-5f71-47e0-bd63-ede2ad79bd7e/sg-core/0.log"
Feb 17 00:30:51 crc kubenswrapper[4791]: I0217 00:30:51.573924 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-6f787cb998-k5dw6_29c809ce-6a9b-4496-9c8e-8cd4506d926b/operator/0.log"
Feb 17 00:30:51 crc kubenswrapper[4791]: I0217 00:30:51.920427 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9/prometheus/0.log"
Feb 17 00:30:52 crc kubenswrapper[4791]: I0217 00:30:52.200767 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_335ade17-e7c1-487c-9e12-ad3d0d3610b0/elasticsearch/0.log"
Feb 17 00:30:52 crc kubenswrapper[4791]: I0217 00:30:52.538150 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-z8m2z_135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd/prometheus-webhook-snmp/0.log"
Feb 17 00:30:52 crc kubenswrapper[4791]: I0217 00:30:52.864035 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_df247d19-621c-4c9b-a436-d4f263dcb5ae/alertmanager/0.log"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.201527 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kj28p"]
Feb 17 00:30:58 crc kubenswrapper[4791]: E0217 00:30:58.202368 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-collectd"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202385 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-collectd"
Feb 17 00:30:58 crc kubenswrapper[4791]: E0217 00:30:58.202406 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerName="collect-profiles"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202414 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerName="collect-profiles"
Feb 17 00:30:58 crc kubenswrapper[4791]: E0217 00:30:58.202434 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-ceilometer"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202443 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-ceilometer"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202576 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-ceilometer"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202590 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-collectd"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202602 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerName="collect-profiles"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.203676 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.218511 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj28p"]
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.318847 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.319003 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.319041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420100 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420213 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420242 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420790 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.421068 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.445574 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.519834 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.015939 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj28p"]
Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.810630 4791 generic.go:334] "Generic (PLEG): container finished" podID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5" exitCode=0
Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.810742 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"}
Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.810919 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerStarted","Data":"71a6b3b5f0cefd2907aba0a8a16ba8ee674dbdf8214bb14dcede67bd40deffee"}
Feb 17 00:31:00 crc kubenswrapper[4791]: I0217 00:31:00.830327 4791 generic.go:334] "Generic (PLEG): container finished" podID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8" exitCode=0
Feb 17 00:31:00 crc kubenswrapper[4791]: I0217 00:31:00.830385 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"}
Feb 17 00:31:01 crc kubenswrapper[4791]: I0217 00:31:01.839814 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerStarted","Data":"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"}
Feb 17 00:31:01 crc kubenswrapper[4791]: I0217 00:31:01.859713 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kj28p" podStartSLOduration=2.462567457 podStartE2EDuration="3.859695876s" podCreationTimestamp="2026-02-17 00:30:58 +0000 UTC" firstStartedPulling="2026-02-17 00:30:59.813150083 +0000 UTC m=+1517.292662620" lastFinishedPulling="2026-02-17 00:31:01.210278512 +0000 UTC m=+1518.689791039" observedRunningTime="2026-02-17 00:31:01.855431393 +0000 UTC m=+1519.334943920" watchObservedRunningTime="2026-02-17 00:31:01.859695876 +0000 UTC m=+1519.339208403"
Feb 17 00:31:07 crc kubenswrapper[4791]: I0217 00:31:07.971048 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-6f88b4fbc-h6xn7_c8e979be-fe5a-4d89-b1a6-0260fffdd27c/operator/0.log"
Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.520024 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.520097 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.596198 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.938332 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.989373 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kj28p"]
Feb 17 00:31:10 crc kubenswrapper[4791]: I0217 00:31:10.918016 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kj28p" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server" containerID="cri-o://5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" gracePeriod=2
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.317590 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.409780 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") "
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.409878 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") "
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.409956 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") "
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.410853 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities" (OuterVolumeSpecName: "utilities") pod "b014ddb5-2da5-47ba-85bd-4e93e8f670c4" (UID: "b014ddb5-2da5-47ba-85bd-4e93e8f670c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.417249 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b" (OuterVolumeSpecName: "kube-api-access-ztf7b") pod "b014ddb5-2da5-47ba-85bd-4e93e8f670c4" (UID: "b014ddb5-2da5-47ba-85bd-4e93e8f670c4"). InnerVolumeSpecName "kube-api-access-ztf7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.489616 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b014ddb5-2da5-47ba-85bd-4e93e8f670c4" (UID: "b014ddb5-2da5-47ba-85bd-4e93e8f670c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.511655 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.511719 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") on node \"crc\" DevicePath \"\""
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.511745 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926361 4791 generic.go:334] "Generic (PLEG): container finished" podID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" exitCode=0
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926451 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"}
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"71a6b3b5f0cefd2907aba0a8a16ba8ee674dbdf8214bb14dcede67bd40deffee"}
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926520 4791 scope.go:117] "RemoveContainer" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926564 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p"
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.947537 4791 scope.go:117] "RemoveContainer" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.962938 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kj28p"]
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.983401 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kj28p"]
Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.989631 4791 scope.go:117] "RemoveContainer" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003140 4791 scope.go:117] "RemoveContainer" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"
Feb 17 00:31:12 crc kubenswrapper[4791]: E0217 00:31:12.003577 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794\": container with ID starting with 5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794 not found: ID does not exist" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003608 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"} err="failed to get container status \"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794\": rpc error: code = NotFound desc = could not find container \"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794\": container with ID starting with 5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794 not found: ID does not exist"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003628 4791 scope.go:117] "RemoveContainer" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"
Feb 17 00:31:12 crc kubenswrapper[4791]: E0217 00:31:12.003932 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8\": container with ID starting with dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8 not found: ID does not exist" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003980 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"} err="failed to get container status \"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8\": rpc error: code = NotFound desc = could not find container \"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8\": container with ID starting with dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8 not found: ID does not exist"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.004012 4791 scope.go:117] "RemoveContainer" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"
Feb 17 00:31:12 crc kubenswrapper[4791]: E0217 00:31:12.004664 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5\": container with ID starting with a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5 not found: ID does not exist" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.004757 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"} err="failed to get container status \"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5\": rpc error: code = NotFound desc = could not find container \"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5\": container with ID starting with a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5 not found: ID does not exist"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.088124 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-6f787cb998-k5dw6_29c809ce-6a9b-4496-9c8e-8cd4506d926b/operator/0.log"
Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.364135 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_43948ca6-1e04-4a7f-867d-d5f6d69d240d/qdr/0.log"
Feb 17 00:31:13 crc kubenswrapper[4791]: I0217 00:31:13.234378 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" path="/var/lib/kubelet/pods/b014ddb5-2da5-47ba-85bd-4e93e8f670c4/volumes"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.928584 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:31:36 crc kubenswrapper[4791]: E0217 00:31:36.930672 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-utilities"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.930775 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-utilities"
Feb 17 00:31:36 crc kubenswrapper[4791]: E0217 00:31:36.930855 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-content"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.930930 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-content"
Feb 17 00:31:36 crc kubenswrapper[4791]: E0217 00:31:36.931007 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.931136 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.931383 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.932535 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.939707 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-glgdj"/"default-dockercfg-54jg6"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.942053 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-glgdj"/"kube-root-ca.crt"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.942256 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-glgdj"/"openshift-service-ca.crt"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.958505 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.995209 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.995289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.096972 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.097377 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.098073 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.124095 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.252892 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.491371 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:31:38 crc kubenswrapper[4791]: I0217 00:31:38.132025 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerStarted","Data":"e90a6164f4d9cc20a6c3e3e71a51caefc00efd669525e95904943bf687926582"}
Feb 17 00:31:46 crc kubenswrapper[4791]: I0217 00:31:46.194081 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerStarted","Data":"7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004"}
Feb 17 00:31:46 crc kubenswrapper[4791]: I0217 00:31:46.194961 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerStarted","Data":"4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"}
Feb 17 00:31:46 crc kubenswrapper[4791]: I0217 00:31:46.220268 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-glgdj/must-gather-f94nt" podStartSLOduration=2.380193307 podStartE2EDuration="10.220246989s" podCreationTimestamp="2026-02-17 00:31:36 +0000 UTC" firstStartedPulling="2026-02-17 00:31:37.51394872 +0000 UTC m=+1554.993461247" lastFinishedPulling="2026-02-17 00:31:45.354002402 +0000 UTC m=+1562.833514929" observedRunningTime="2026-02-17 00:31:46.211849767 +0000 UTC m=+1563.691362304" watchObservedRunningTime="2026-02-17 00:31:46.220246989 +0000 UTC m=+1563.699759536"
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.301299 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"]
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.303462 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9"
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.314442 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"]
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.394648 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9"
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.394978 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9"
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.395023 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9"
Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.495841 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") "
pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.495891 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.495931 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.496282 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.496344 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.523944 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc 
kubenswrapper[4791]: I0217 00:32:21.662756 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.101538 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.485319 4791 generic.go:334] "Generic (PLEG): container finished" podID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" exitCode=0 Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.485362 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600"} Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.485389 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerStarted","Data":"e3c1ae6008c18b4a67644ca33fef834fc97a5ba76e470d56247db28c4077af92"} Feb 17 00:32:24 crc kubenswrapper[4791]: I0217 00:32:24.501818 4791 generic.go:334] "Generic (PLEG): container finished" podID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" exitCode=0 Feb 17 00:32:24 crc kubenswrapper[4791]: I0217 00:32:24.501886 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a"} Feb 17 00:32:25 crc kubenswrapper[4791]: I0217 00:32:25.511243 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" 
event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerStarted","Data":"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d"} Feb 17 00:32:25 crc kubenswrapper[4791]: I0217 00:32:25.532098 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h7fp9" podStartSLOduration=1.89400008 podStartE2EDuration="4.532077065s" podCreationTimestamp="2026-02-17 00:32:21 +0000 UTC" firstStartedPulling="2026-02-17 00:32:22.48653079 +0000 UTC m=+1599.966043307" lastFinishedPulling="2026-02-17 00:32:25.124607725 +0000 UTC m=+1602.604120292" observedRunningTime="2026-02-17 00:32:25.527407949 +0000 UTC m=+1603.006920476" watchObservedRunningTime="2026-02-17 00:32:25.532077065 +0000 UTC m=+1603.011589602" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.277518 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kqr99_03d7a8df-a8a3-4b34-bd28-d554ae70875a/control-plane-machine-set-operator/0.log" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.445966 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5nwz7_9c752f56-7754-4718-aea5-cb41d6ac4253/machine-api-operator/0.log" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.454961 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5nwz7_9c752f56-7754-4718-aea5-cb41d6ac4253/kube-rbac-proxy/0.log" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.663567 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.663622 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.705798 
4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:32 crc kubenswrapper[4791]: I0217 00:32:32.622826 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:32 crc kubenswrapper[4791]: I0217 00:32:32.670188 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:34 crc kubenswrapper[4791]: I0217 00:32:34.585821 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h7fp9" podUID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerName="registry-server" containerID="cri-o://71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" gracePeriod=2 Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.008943 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.128035 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"ec42b5a3-c15d-4fef-9370-86ae9da61992\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.128195 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"ec42b5a3-c15d-4fef-9370-86ae9da61992\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.128320 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"ec42b5a3-c15d-4fef-9370-86ae9da61992\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.129474 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities" (OuterVolumeSpecName: "utilities") pod "ec42b5a3-c15d-4fef-9370-86ae9da61992" (UID: "ec42b5a3-c15d-4fef-9370-86ae9da61992"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.138102 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr" (OuterVolumeSpecName: "kube-api-access-g2mbr") pod "ec42b5a3-c15d-4fef-9370-86ae9da61992" (UID: "ec42b5a3-c15d-4fef-9370-86ae9da61992"). InnerVolumeSpecName "kube-api-access-g2mbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.229795 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") on node \"crc\" DevicePath \"\"" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.229842 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.258898 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec42b5a3-c15d-4fef-9370-86ae9da61992" (UID: "ec42b5a3-c15d-4fef-9370-86ae9da61992"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.330744 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594301 4791 generic.go:334] "Generic (PLEG): container finished" podID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" exitCode=0 Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594378 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d"} Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594417 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594442 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"e3c1ae6008c18b4a67644ca33fef834fc97a5ba76e470d56247db28c4077af92"} Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594483 4791 scope.go:117] "RemoveContainer" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.612822 4791 scope.go:117] "RemoveContainer" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.644793 4791 scope.go:117] "RemoveContainer" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.645784 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.654303 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.677944 4791 scope.go:117] "RemoveContainer" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" Feb 17 00:32:35 crc kubenswrapper[4791]: E0217 00:32:35.678368 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d\": container with ID starting with 71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d not found: ID does not exist" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678417 4791 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d"} err="failed to get container status \"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d\": rpc error: code = NotFound desc = could not find container \"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d\": container with ID starting with 71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d not found: ID does not exist" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678453 4791 scope.go:117] "RemoveContainer" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" Feb 17 00:32:35 crc kubenswrapper[4791]: E0217 00:32:35.678793 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a\": container with ID starting with 2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a not found: ID does not exist" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678829 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a"} err="failed to get container status \"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a\": rpc error: code = NotFound desc = could not find container \"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a\": container with ID starting with 2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a not found: ID does not exist" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678849 4791 scope.go:117] "RemoveContainer" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" Feb 17 00:32:35 crc kubenswrapper[4791]: E0217 
00:32:35.679145 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600\": container with ID starting with 5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600 not found: ID does not exist" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.679192 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600"} err="failed to get container status \"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600\": rpc error: code = NotFound desc = could not find container \"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600\": container with ID starting with 5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600 not found: ID does not exist" Feb 17 00:32:37 crc kubenswrapper[4791]: I0217 00:32:37.236884 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec42b5a3-c15d-4fef-9370-86ae9da61992" path="/var/lib/kubelet/pods/ec42b5a3-c15d-4fef-9370-86ae9da61992/volumes" Feb 17 00:32:43 crc kubenswrapper[4791]: I0217 00:32:43.944013 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9dsmn_bf759390-4034-42c9-811b-531aeabd3ed6/cert-manager-controller/0.log" Feb 17 00:32:44 crc kubenswrapper[4791]: I0217 00:32:44.091875 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-l9798_aca3c38d-0dd8-4457-854a-b392ba180087/cert-manager-cainjector/0.log" Feb 17 00:32:44 crc kubenswrapper[4791]: I0217 00:32:44.149561 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-42zhs_4b26c415-6a42-4bda-abbd-cf394bc94043/cert-manager-webhook/0.log" Feb 17 
00:32:54 crc kubenswrapper[4791]: I0217 00:32:54.973062 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:32:54 crc kubenswrapper[4791]: I0217 00:32:54.974311 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.328362 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rw6pj_f73f7b40-6611-465e-ae69-d2f70ce77651/prometheus-operator/0.log" Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.448570 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f_b046e97f-6343-4e3f-ae0a-0fb40687d992/prometheus-operator-admission-webhook/0.log" Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.507206 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5_8c43370f-07b8-4f84-b716-34af90be5850/prometheus-operator-admission-webhook/0.log" Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.636374 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v2lwp_307585d5-5ed8-43df-b5d8-977729339610/operator/0.log" Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.686031 4791 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-mk4lp_3b110234-d36d-4ced-a2be-7913bbb84d2a/perses-operator/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.042646 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/util/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.187693 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/pull/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.197424 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/pull/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.208518 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/util/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.397332 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/pull/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.400784 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/util/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.412905 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/extract/0.log" Feb 17 00:33:13 crc 
kubenswrapper[4791]: I0217 00:33:13.567224 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/util/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.723811 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/pull/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.738392 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/util/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.764432 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/pull/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.885828 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/util/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.907824 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/extract/0.log" Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.933185 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/pull/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.085960 4791 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/util/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.222717 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/util/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.255696 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/pull/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.280663 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/pull/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.442167 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/extract/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.450970 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/util/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.462351 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/pull/0.log" Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.601278 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/util/0.log" Feb 17 
00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.777991 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.793657 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.808519 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.920771 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.949153 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.964486 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/extract/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.116250 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.247066 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.285459 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.285497 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.458296 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.500360 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.656660 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/registry-server/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.674760 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.862369 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.865815 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.887087 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.007273 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.069618 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.218180 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-t8x7k_66e06ad0-6874-4a52-94d8-76da74f7336b/marketplace-operator/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.271279 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/registry-server/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.291280 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.455514 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.456198 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.483159 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.627026 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.630788 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.929530 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/registry-server/0.log"
Feb 17 00:33:24 crc kubenswrapper[4791]: I0217 00:33:24.973865 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:33:24 crc kubenswrapper[4791]: I0217 00:33:24.974646 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.622583 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f_b046e97f-6343-4e3f-ae0a-0fb40687d992/prometheus-operator-admission-webhook/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.655483 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5_8c43370f-07b8-4f84-b716-34af90be5850/prometheus-operator-admission-webhook/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.658367 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rw6pj_f73f7b40-6611-465e-ae69-d2f70ce77651/prometheus-operator/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.747596 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v2lwp_307585d5-5ed8-43df-b5d8-977729339610/operator/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.814407 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-mk4lp_3b110234-d36d-4ced-a2be-7913bbb84d2a/perses-operator/0.log"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.972579 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973131 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973181 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973725 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973781 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104" gracePeriod=600
Feb 17 00:33:55 crc kubenswrapper[4791]: I0217 00:33:55.264099 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104" exitCode=0
Feb 17 00:33:55 crc kubenswrapper[4791]: I0217 00:33:55.264152 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"}
Feb 17 00:33:55 crc kubenswrapper[4791]: I0217 00:33:55.264579 4791 scope.go:117] "RemoveContainer" containerID="0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"
Feb 17 00:33:55 crc kubenswrapper[4791]: E0217 00:33:55.606547 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:33:56 crc kubenswrapper[4791]: I0217 00:33:56.277446 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:33:56 crc kubenswrapper[4791]: E0217 00:33:56.277814 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:07 crc kubenswrapper[4791]: I0217 00:34:07.224416 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:07 crc kubenswrapper[4791]: E0217 00:34:07.225195 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:19 crc kubenswrapper[4791]: I0217 00:34:19.538689 4791 generic.go:334] "Generic (PLEG): container finished" podID="240269b5-7b03-4e43-8d40-106c95b85777" containerID="4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d" exitCode=0
Feb 17 00:34:19 crc kubenswrapper[4791]: I0217 00:34:19.539281 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerDied","Data":"4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"}
Feb 17 00:34:19 crc kubenswrapper[4791]: I0217 00:34:19.540216 4791 scope.go:117] "RemoveContainer" containerID="4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"
Feb 17 00:34:20 crc kubenswrapper[4791]: I0217 00:34:20.220895 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:20 crc kubenswrapper[4791]: E0217 00:34:20.221460 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:20 crc kubenswrapper[4791]: I0217 00:34:20.344366 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/gather/0.log"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.468084 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.470773 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-glgdj/must-gather-f94nt" podUID="240269b5-7b03-4e43-8d40-106c95b85777" containerName="copy" containerID="cri-o://7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004" gracePeriod=2
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.478703 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.618742 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/copy/0.log"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.619606 4791 generic.go:334] "Generic (PLEG): container finished" podID="240269b5-7b03-4e43-8d40-106c95b85777" containerID="7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004" exitCode=143
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.854510 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/copy/0.log"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.855100 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.980080 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"240269b5-7b03-4e43-8d40-106c95b85777\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") "
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.980135 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"240269b5-7b03-4e43-8d40-106c95b85777\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") "
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.996233 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67" (OuterVolumeSpecName: "kube-api-access-p4x67") pod "240269b5-7b03-4e43-8d40-106c95b85777" (UID: "240269b5-7b03-4e43-8d40-106c95b85777"). InnerVolumeSpecName "kube-api-access-p4x67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.033800 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "240269b5-7b03-4e43-8d40-106c95b85777" (UID: "240269b5-7b03-4e43-8d40-106c95b85777"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.082067 4791 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.082458 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") on node \"crc\" DevicePath \"\""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.630424 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/copy/0.log"
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.630831 4791 scope.go:117] "RemoveContainer" containerID="7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004"
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.630923 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.654234 4791 scope.go:117] "RemoveContainer" containerID="4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"
Feb 17 00:34:29 crc kubenswrapper[4791]: I0217 00:34:29.228911 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240269b5-7b03-4e43-8d40-106c95b85777" path="/var/lib/kubelet/pods/240269b5-7b03-4e43-8d40-106c95b85777/volumes"
Feb 17 00:34:31 crc kubenswrapper[4791]: I0217 00:34:31.221636 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:31 crc kubenswrapper[4791]: E0217 00:34:31.222551 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:45 crc kubenswrapper[4791]: I0217 00:34:45.221205 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:45 crc kubenswrapper[4791]: E0217 00:34:45.222329 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:00 crc kubenswrapper[4791]: I0217 00:35:00.220534 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:00 crc kubenswrapper[4791]: E0217 00:35:00.221099 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:14 crc kubenswrapper[4791]: I0217 00:35:14.220840 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:14 crc kubenswrapper[4791]: E0217 00:35:14.222255 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:28 crc kubenswrapper[4791]: I0217 00:35:28.221591 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:28 crc kubenswrapper[4791]: E0217 00:35:28.222655 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:40 crc kubenswrapper[4791]: I0217 00:35:40.220094 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:40 crc kubenswrapper[4791]: E0217 00:35:40.221003 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:54 crc kubenswrapper[4791]: I0217 00:35:54.220960 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:54 crc kubenswrapper[4791]: E0217 00:35:54.222035 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:36:05 crc kubenswrapper[4791]: I0217 00:36:05.220478 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:36:05 crc kubenswrapper[4791]: E0217 00:36:05.221436 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:36:17 crc kubenswrapper[4791]: I0217 00:36:17.220797 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:36:17 crc kubenswrapper[4791]: E0217 00:36:17.222033 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:36:30 crc kubenswrapper[4791]: I0217 00:36:30.220997 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:36:30 crc kubenswrapper[4791]: E0217 00:36:30.222158 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:36:43 crc kubenswrapper[4791]: I0217 00:36:43.227896 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:36:43 crc kubenswrapper[4791]: E0217 00:36:43.228927 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"